2013-11-29

I've been playing around with Hadoop and its sister projects, and I've had a few problems along the way, but I've finally hit one I can't find an answer to: a Hive MapReduce job submission fails with "Target is a directory".

I have a Hive table stored on HDFS as a tab-delimited text file. I can run basic selects against the table, but as soon as I make the query slightly more complex, Hive turns it into a MapReduce job, which fails with the following stack trace:

13/11/29 08:31:00 ERROR security.UserGroupInformation: PriviledgedActionException as:hduser (auth:SIMPLE) cause:java.io.IOException: Target /tmp/hadoop-yarn/staging/hduser/.staging/job_1385633903169_0013/libjars/lib/lib is a directory
java.io.IOException: Target /tmp/hadoop-yarn/staging/hduser/.staging/job_1385633903169_0013/libjars/lib/lib is a directory
    at org.apache.hadoop.fs.FileUtil.checkDest(FileUtil.java:500)
    at org.apache.hadoop.fs.FileUtil.checkDest(FileUtil.java:502)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:348)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyRemoteFiles(JobSubmitter.java:139)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:212)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:300)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:387)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:144)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Job Submission failed with exception 'java.io.IOException(Target /tmp/hadoop-yarn/staging/hduser/.staging/job_1385633903169_0013/libjars/lib/lib is a directory)'
13/11/29 08:31:00 ERROR exec.Task: Job Submission failed with exception 'java.io.IOException(Target /tmp/hadoop-yarn/staging/hduser/.staging/job_1385633903169_0013/libjars/lib/lib is a directory)'

The folder in question does exist on DFS, at least the "/tmp/hadoop-yarn/staging" part, and no matter what permissions I set on it, Hive or Hadoop resets them at job submission. The really puzzling part is that the full path looks like a generated folder name, so why does the software have a problem with something it generated itself? And why is it a problem that the path is a directory? What is it supposed to be instead?
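For anyone wanting to check the same thing, the staging area can be inspected from the HDFS client; this is just a sketch, assuming the `hdfs` command is on the PATH and the default staging location from the trace above:

```shell
# Show ownership/permissions of the staging root itself (-d lists the
# directory entry rather than its contents).
hdfs dfs -ls -d /tmp/hadoop-yarn/staging

# Per-user staging area; .staging/job_* directories are created and
# removed per job, so a leftover one usually means an earlier
# submission died partway through copying its files.
hdfs dfs -ls /tmp/hadoop-yarn/staging/hduser/.staging
```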

EDIT: Here is my table, and the query I'm trying to run:

Query: select * from hive_flow_details where node_id = 100 limit 10;

The table:

col_name          data_type  comment
id                bigint     None
flow_versions_id  int        None
node_id           int        None
node_name         string     None

Keep in mind that this happens on any query where I try to use any kind of where clause, since Hive converts it into an MR job.
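One way to see that split from the CLI is to compare the plans of the two query shapes with EXPLAIN; a hedged sketch, assuming the `hive` CLI is on the PATH and using the table from the edit above:

```shell
# A bare select + limit is served by a simple fetch task, so it never
# reaches job submission:
hive -e 'explain select * from hive_flow_details limit 10;'

# Adding a predicate compiles a MapReduce stage into the plan, and
# submitting that stage is where the failure above occurs:
hive -e 'explain select * from hive_flow_details where node_id = 100 limit 10;'
```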

Can you post the query you are trying to run?

Answer


I eventually solved this. I found conflicting jars on my classpath; after cleaning them up I haven't had any problems since.
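For anyone hitting the same thing, a quick way to spot duplicate jars is to list the classpath and look for the same artifact name appearing under more than one version. The paths below are hypothetical, purely for illustration; on a real node you would set `classpath="$(hadoop classpath)"` instead:

```shell
# Hypothetical classpath entries; on a cluster use: classpath="$(hadoop classpath)"
classpath="/opt/hive/lib/guava-11.0.2.jar:/opt/hadoop/lib/guava-14.0.1.jar:/opt/hadoop/lib/slf4j-api-1.7.5.jar"

# Split on ':', reduce each entry to its artifact name (strip the
# directory, then the version suffix and .jar extension), and report
# names that occur more than once.
echo "$classpath" | tr ':' '\n' \
  | xargs -n1 basename \
  | sed 's/-[0-9][0-9.]*\.jar$//' \
  | sort | uniq -d
# → guava   (two different Guava versions on the path)
```

Removing or shading whichever duplicate the cluster doesn't need is usually enough.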
