
I want to compile the Hadoop WordCount example using the commands below, but an error about the files in DFS occurs (shown in the screenshot):

$ mkdir wordcount_classes 
$ javac -classpath ${HADOOP_HOME}/hadoop-${HADOOP_VERSION}-core.jar -d wordcount_classes WordCount.java 
$ jar -cvf /usr/joe/wordcount.jar -C wordcount_classes/ . 
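
Note: on newer Hadoop distributions there is no single hadoop-${HADOOP_VERSION}-core.jar; as a sketch under that assumption, the compile step can instead use the classpath printed by `hadoop classpath`:

$ javac -classpath $(hadoop classpath) -d wordcount_classes WordCount.java 
$ jar -cvf /usr/joe/wordcount.jar -C wordcount_classes/ . 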

Assuming that: 

    /usr/joe/wordcount/input - input directory in HDFS 
    /usr/joe/wordcount/output - output directory in HDFS 

Sample text-files as input: 

$ bin/hadoop dfs -ls /usr/joe/wordcount/input/ 
/usr/joe/wordcount/input/file01 
/usr/joe/wordcount/input/file02 

$ bin/hadoop dfs -cat /usr/joe/wordcount/input/file01 
Hello World Bye World 

$ bin/hadoop dfs -cat /usr/joe/wordcount/input/file02 
Hello Hadoop Goodbye Hadoop 
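
For context, the tutorial these commands come from then launches the job with something like the following (the driver class name org.myorg.WordCount is the tutorial's and is an assumption here, since the failing command itself is only visible in the screenshot):

$ bin/hadoop jar /usr/joe/wordcount.jar org.myorg.WordCount /usr/joe/wordcount/input /usr/joe/wordcount/output 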

Answer


/usr/joe is a local path, as you can see from the first line of your ls command. The second command expects input and output locations on HDFS, but /usr/joe does not exist on HDFS. You need to move the data onto HDFS first and then run the command. For example:

#This creates a folder "wordcount/input" in your HDFS home directory 
hdfs dfs -mkdir -p wordcount/input 
hdfs dfs -put /usr/joe/wordcount/input/* wordcount/input 
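
Once the input is on HDFS, the job can be launched against those HDFS paths and the result inspected; a minimal sketch, assuming the jar built above and the tutorial's org.myorg.WordCount driver class:

hadoop jar /usr/joe/wordcount.jar org.myorg.WordCount wordcount/input wordcount/output 
# Print the job output once it finishes 
hdfs dfs -cat wordcount/output/*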

OK... in my HDFS directory I only have files (datanode/namenode); should /usr/joe be a directory in my HDFS, or can I create it there? – seso


I would suggest adding the data to your home directory, which in your case looks like it might be '/user/justcbuser'. –
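
A quick way to check what your HDFS home directory actually is (the name justcbuser above is only the commenter's guess at it):

# Home directories normally live under /user/<username>; listing /user shows which ones exist 
hdfs dfs -ls /user 
# Relative paths such as "wordcount/input" resolve against that home directory 
hdfs dfs -ls wordcount/input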