I have been trying for a few days to get a WordCount (MapReduce) job started via Oozie. When started normally (cmd: `hadoop jar *.jar mainClass input output`), everything works fine. With my current Oozie configuration, the launched MapReduce job gets killed. Why? The application layout is:
- /ApplicationDIR/lib/WordCount.jar
- /ApplicationDIR/workflow.xml
- /Text-IN
- /Text-OUT
workflow.xml
<action name='wordcount'>
    <map-reduce>
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <prepare>
            <delete path="${outputDir}" />
        </prepare>
        <configuration>
            <property>
                <name>mapred.job.queue.name</name>
                <value>${queueName}</value>
            </property>
            <property>
                <name>mapred.mapper.class</name>
                <value>HadoopJobs.wordCound.WordCountMR.Map</value>
            </property>
            <property>
                <name>mapred.reducer.class</name>
                <value>HadoopJobs.wordCound.WordCountMR.Reduce</value>
            </property>
            <property>
                <name>mapreduce.input.fileinputformat.inputdir</name>
                <value>${inputDir}</value>
            </property>
            <property>
                <name>mapreduce.output.fileoutputformat.outputdir</name>
                <value>${outputDir}</value>
            </property>
        </configuration>
    </map-reduce>
    <ok to='end'/>
    <error to='kill'/>
</action>
<kill name='kill'>
    <message>ERROR: [${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name='end'/>
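One common cause of a map-reduce action dying immediately is a mismatch between the Hadoop API the mapper/reducer classes extend and the properties used to register them: `mapred.mapper.class`/`mapred.reducer.class` only work for old-API (`org.apache.hadoop.mapred`) classes. If `WordCountMR.Map`/`Reduce` extend the new `org.apache.hadoop.mapreduce` API (which is not shown in the question, so this is an assumption), a hedged sketch of the alternative configuration would be:

```xml
<!-- Sketch, assuming Map/Reduce extend the new org.apache.hadoop.mapreduce API.
     The new-api flags tell the Oozie launcher to configure the job with the
     new-API property names instead of mapred.mapper.class / mapred.reducer.class. -->
<property>
    <name>mapred.mapper.new-api</name>
    <value>true</value>
</property>
<property>
    <name>mapred.reducer.new-api</name>
    <value>true</value>
</property>
<property>
    <name>mapreduce.job.map.class</name>
    <value>HadoopJobs.wordCound.WordCountMR.Map</value>
</property>
<property>
    <name>mapreduce.job.reduce.class</name>
    <value>HadoopJobs.wordCound.WordCountMR.Reduce</value>
</property>
```

Note also that the snippet above must sit inside an enclosing `<workflow-app name="..." xmlns="uri:oozie:workflow:0.2">` root element; if that wrapper is missing from the actual file, Oozie will reject the workflow at submission time.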
job.properties
nameNode=hdfs://192.168.1.110:8020
jobTracker=192.168.1.110:8050
queueName=default
oozie.wf.application.path=${nameNode}/tmp/testDIR/wordcount-example/ApplicationDIR
inputDir=hdfs://192.168.1.110:8020/tmp/testDIR/wordcount-example/Text-IN
outputDir=hdfs://192.168.1.110:8020/tmp/testDIR/wordcount-example/Text-OUT
Command:
oozie job -oozie http://192.168.1.110:11000/oozie/ -config job.properties -run
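After submission, the Oozie CLI can show the job's state and the failing action. The job ID below is a placeholder; substitute the ID that the `-run` command prints:

```shell
# Show workflow status and per-action state (look for the KILLED action)
oozie job -oozie http://192.168.1.110:11000/oozie/ -info <job-id>

# Dump the workflow's Oozie log for the same job
oozie job -oozie http://192.168.1.110:11000/oozie/ -log <job-id>
```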
Result:
--UPDATE--
Oozie log: https://docs.google.com/document/d/1BKnv4dSEscRqpzKLhOjUaryveSP3q0454uL_5_xVPdk/edit?usp=sharing
Can you share the Oozie log and the job server logs? –
@KSNidhin I have added the Oozie log. I checked it too, but found nothing in particular. Can you tell me where to find the JobTracker logs? –
For that you have to open the JT UI and search for the job. You can get the job/action ID from the Oozie UI itself and use its Analysis tab to view the Job Tracker job, or you can manually search the local mapred data.dir location for the job ID's error details. – Deb