How to get Job and Kill jobs in Hadoop?

This is for the Hadoop ecosystem, e.g. HDFS, MapReduce, Hive, HBase, Pig, Sqoop, Sqoop2, Avro, Solr, HCatalog, Impala, Oozie and ZooKeeper, and Hadoop distributions like Cloudera, Hortonworks etc.

How to get Job and Kill jobs in Hadoop?

Postby dharama123 » Thu Sep 18, 2014 2:34 am

How do I get the ID of a running job and then kill that job in Hadoop?



Re: How to get Job and Kill jobs in Hadoop?

Postby Guest » Sat Sep 20, 2014 10:15 pm

The hadoop job command lets you manage MapReduce jobs. The two commands below will list running jobs (with their job IDs) and kill a job by its ID:

hadoop job -list
hadoop job -kill <job-id>
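As a sketch, the job IDs printed by hadoop job -list can be pulled out with standard text tools and fed to hadoop job -kill. The sample output below is illustrative only (the exact columns vary by Hadoop version), and the kill loop is commented out since it needs a live cluster:

```shell
# Illustrative output of `hadoop job -list` (column layout varies by version):
sample_output='2 jobs currently running
JobId                  State  StartTime      UserName  Priority  SchedulingInfo
job_201409180001_0042  1      1411012345678  alice     NORMAL    NA
job_201409180001_0043  1      1411012399999  bob       NORMAL    NA'

# Keep only the job IDs: lines whose first field starts with "job_".
job_ids=$(printf '%s\n' "$sample_output" | awk '$1 ~ /^job_/ {print $1}')
printf '%s\n' "$job_ids"

# On a real cluster you would then loop over them:
# for id in $job_ids; do hadoop job -kill "$id"; done
```

The same awk filter works if you first narrow the list to one user, e.g. by inserting grep alice before the awk step.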

Syntax
hadoop job [Generic Options]
[-submit <job-file>]
[-status <job-id>]
[-counter <job-id> <group-name> <counter-name>]
[-kill <job-id>]
[-unblacklist <job-id> <hostname>]
[-unblacklist-tracker <hostname>]
[-set-priority <job-id> <priority>]
[-events <job-id> <from-event-#> <#-of-events>]
[-history <jobOutputDir>]
[-list [all]]
[-list-active-trackers]
[-list-blacklisted-trackers]
[-list-attempt-ids <job-id> <task-type> <task-state>]
[-kill-task <task-id>]
[-fail-task <task-id>]
[-blacklist-tasktracker <hostname>]
[-showlabels]
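Since -kill is irreversible, it can help to sanity-check the argument before issuing it. The helper below is hypothetical (not part of Hadoop): it only accepts strings shaped like a MapReduce job ID (job_<cluster-timestamp>_<sequence>), and uses echo as a stand-in for the real hadoop job -kill call:

```shell
# Hypothetical guard: accept only arguments that look like a MapReduce
# job ID of the form job_<digits>_<digits>.
is_job_id() {
  case "$1" in
    job_[0-9]*_[0-9]*) return 0 ;;
    *)                 return 1 ;;
  esac
}

# Wrapper that refuses to kill anything that is not a job ID.
# Replace the echo with the real `hadoop job -kill "$1"` on a cluster.
safe_kill() {
  if is_job_id "$1"; then
    echo "would run: hadoop job -kill $1"
  else
    echo "not a job ID: $1" >&2
    return 1
  fi
}

safe_kill job_201409180001_0042   # prints: would run: hadoop job -kill job_201409180001_0042
```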

