What commands are used to see all jobs running in the Hadoop cluster and kill a job in Linux?
Answer / Om Shankar
To list all running MapReduce jobs, use the command 'hadoop job -list'. To kill a specific job, first find its job ID in that listing, then run 'hadoop job -kill <job_id>'. On newer Hadoop releases the 'hadoop job' command is deprecated in favour of 'mapred job' (same subcommands), and on YARN clusters you can also use 'yarn application -list' and 'yarn application -kill <application_id>'. On Linux these commands can be run from any directory as long as Hadoop's bin directory is on your PATH; otherwise, run them from the bin directory of the Hadoop installation.
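As a minimal shell sketch of the two steps, here is how you might pull a job ID out of the listing and pass it to the kill command. The sample listing below is an assumption: the exact columns of 'hadoop job -list' vary by Hadoop version, but job IDs consistently start with 'job_'.

```shell
# Simulated output of `hadoop job -list` (column layout is an assumption;
# it varies by Hadoop version, but JobId is the first column):
listing='2 jobs currently running
JobId	State	StartTime	UserName	Priority	SchedulingInfo
job_201604151254_0001	1	1460727300000	hadoop	NORMAL	NA
job_201604151254_0002	4	1460727400000	hadoop	NORMAL	NA'

# Extract the first job ID (lines starting with "job_"); this is the value
# you would hand to the kill command:
job_id=$(printf '%s\n' "$listing" | awk '/^job_/ {print $1; exit}')
echo "$job_id"

# On a live cluster the two steps would be:
#   hadoop job -list
#   hadoop job -kill "$job_id"
# or, on newer releases / YARN:
#   mapred job -kill "$job_id"
#   yarn application -kill <application_id>
```

The awk filter keys on the 'job_' prefix rather than a column header, so it works even if the header row differs between versions.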
How do you see unallocated hard disk space on Linux?
What is the difference between cd and cd in Linux?
What can you tell about the tar command?
What are the MS-DOS commands?
How can you copy lines into the buffer in command mode?
What is GNU in Linux?
How can you check how many network interfaces (Ethernet cards) are up using a single command?
How do you kill a process in Linux?
What are vim commands?
What command is used to lock a user's password?
What are grep patterns called?
What is the bc command in Unix?