What commands are used to see all jobs running in the Hadoop cluster, and how do you kill a job from the Linux command line?
Answer Posted / Om Shankar
To view all running jobs on a Hadoop cluster, use: 'hadoop job -list'. To kill a specific job, first get its job ID from that listing, then run: 'hadoop job -kill <job_id>'. These commands can be run from any directory, as long as the 'hadoop' binary (found under the 'bin' directory of the Hadoop installation) is on your PATH. Note that in newer Hadoop releases 'hadoop job' is deprecated; the equivalent commands are 'mapred job -list' and 'mapred job -kill <job_id>', and YARN applications can be terminated with 'yarn application -kill <application_id>'.
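As a minimal sketch of how the job ID is extracted from the listing before passing it to the kill command, the snippet below parses a hard-coded sample of 'hadoop job -list' output with awk. The sample rows and column layout are an assumption for illustration; the real output comes from a live cluster.

```shell
# Hypothetical sample of 'hadoop job -list' output (format assumed for illustration):
sample_output="JobId State StartTime UserName Priority SchedulingInfo
job_201901011200_0001 1 1546300800000 hadoop NORMAL NA
job_201901011200_0002 1 1546300860000 hadoop NORMAL NA"

# Skip the header line and print only the first column (the job ID).
# Each printed ID could then be passed to: hadoop job -kill <job_id>
echo "$sample_output" | awk 'NR > 1 { print $1 }'
```

On a real cluster you would replace the hard-coded sample with the live command, e.g. `hadoop job -list | awk 'NR > 1 { print $1 }'`.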