2014-02-18 36 views

I was able to configure Hadoop 0.19.1 on my Windows system by following the steps given at

http://ebiquity.umbc.edu/Tutorials/Hadoop/09%20-%20unpack%20hadoop.html

Now I am trying to configure Hadoop 1.2.1 by following the same steps, but I am stuck at the command

$ bin/hadoop namenode -format

which gives me the output shown below:

[email protected] ~ 
$ cd hadoop-1.2.1 

[email protected] ~/hadoop-1.2.1 
$ bin/hadoop namenode -format 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 15: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 19: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 21: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 26: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 30: cd: /home/31: No such file or directory 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 32: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 35: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 87: syntax error: unexpected end of file 
Error: Could not find or load main class org.apache.hadoop.util.PlatformName 
Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode 

[email protected] ~/hadoop-1.2.1 
$ ssh-host-config 

*** ERROR: There are still ssh processes running. Please shut them down first. 

[email protected] ~/hadoop-1.2.1 
$ ssh-host-config 

*** Query: Overwrite existing /etc/ssh_config file? (yes/no) yes 
*** Info: Creating default /etc/ssh_config file 
*** Query: Overwrite existing /etc/sshd_config file? (yes/no) yes 
*** Info: Creating default /etc/sshd_config file 
*** Info: Privilege separation is set to yes by default since OpenSSH 3.3. 
*** Info: However, this requires a non-privileged account called 'sshd'. 
*** Info: For more info on privilege separation read /usr/share/doc/openssh/README.privsep. 
*** Query: Should privilege separation be used? (yes/no) yes 
*** Info: Updating /etc/sshd_config file 
*** Query: Overwrite existing /etc/inetd.d/sshd-inetd file? (yes/no) yes 
*** Info: Creating default /etc/inetd.d/sshd-inetd file 
*** Info: Updated /etc/inetd.d/sshd-inetd 

*** Info: Sshd service is already installed. 

*** Info: Host configuration finished. Have fun! 

[email protected] ~/hadoop-1.2.1 
$ cd .. 

[email protected] ~ 
$ ssh-keygen 
Generating public/private rsa key pair. 
Enter file in which to save the key (/home/313159/.ssh/id_rsa): 
/home/313159/.ssh/id_rsa already exists. 
Overwrite (y/n)? y 
Enter passphrase (empty for no passphrase): 
Enter same passphrase again: 
Your identification has been saved in /home/313159/.ssh/id_rsa. 
Your public key has been saved in /home/313159/.ssh/id_rsa.pub. 
The key fingerprint is: 
21:45:11:30:6b:3a:af:c6:4a:2d:ed:3a:bf:be:69:be [email protected] 
The key's randomart image is: 
+--[ RSA 2048]----+ 
|  oo=o  | 
|  +   | 
|  + .  | 
|  o . .  | 
| o S  | 
| o o   | 
| o.o .   | 
| ..o+o   | 
| oXE+   | 
+-----------------+ 

[email protected] ~/.ssh 
$ ls -l 
total 12 
-rw-r--r-- 1 313159 mkpasswd 2004 Feb 18 11:59 authorized_keys 
-rw------- 1 313159 mkpasswd 668 Oct 21 12:06 id_dsa 
-rw-r--r-- 1 313159 mkpasswd 605 Oct 21 12:06 id_dsa.pub 
-rw------- 1 313159 mkpasswd 1679 Feb 18 11:57 id_rsa 
-rw-r--r-- 1 313159 mkpasswd 397 Feb 18 11:57 id_rsa.pub 
-rw-r--r-- 1 313159 mkpasswd 171 Oct 21 12:00 known_hosts 

[email protected] ~/.ssh 
$ cd .. 

[email protected] ~ 
$ cd hadoop-1.2.1 

[email protected] ~/hadoop-1.2.1 
$ bin/hadoop namenode -format 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 15: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 19: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 21: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 26: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 30: cd: /home/31: No such file or directory 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 32: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 35: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 87: syntax error: unexpected end of file 
Error: Could not find or load main class org.apache.hadoop.util.PlatformName 
Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode 

[email protected] ~/hadoop-1.2.1 
$ cd bin 

[email protected] ~/hadoop-1.2.1/bin 
$ dos2unix *.sh 
dos2unix: converting file hadoop-config.sh to Unix format ... 
dos2unix: converting file hadoop-daemon.sh to Unix format ... 
dos2unix: converting file hadoop-daemons.sh to Unix format ... 
dos2unix: converting file slaves.sh to Unix format ... 
dos2unix: converting file start-all.sh to Unix format ... 
dos2unix: converting file start-balancer.sh to Unix format ... 
dos2unix: converting file start-dfs.sh to Unix format ... 
dos2unix: converting file start-jobhistoryserver.sh to Unix format ... 
dos2unix: converting file start-mapred.sh to Unix format ... 
dos2unix: converting file stop-all.sh to Unix format ... 
dos2unix: converting file stop-balancer.sh to Unix format ... 
dos2unix: converting file stop-dfs.sh to Unix format ... 
dos2unix: converting file stop-jobhistoryserver.sh to Unix format ... 
dos2unix: converting file stop-mapred.sh to Unix format ... 

[email protected] ~/hadoop-1.2.1/bin 
$ cd .. 

[email protected] ~/hadoop-1.2.1 
$ cd conf 

[email protected] ~/hadoop-1.2.1/conf 
$ dos2unix *.sh 
dos2unix: converting file hadoop-env.sh to Unix format ... 

[email protected] ~/hadoop-1.2.1/conf 
$ cd .. 

[email protected] ~/hadoop-1.2.1 
$ bin/hadoop namenode -format 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 15: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 19: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 21: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 26: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 30: cd: /home/31: No such file or directory 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 32: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 35: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 87: syntax error: unexpected end of file 
Error: Could not find or load main class org.apache.hadoop.util.PlatformName 
Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode 

Can anyone please suggest/share a solution to this error? Thanks.


This question appears to be off-topic because it is a possible duplicate of http://stackoverflow.com/questions/12839705/cant-find-or-load-main-class-error-in-hadoop and http://stackoverflow.com/questions/14852199/hdfs-namenood-formatting-error-could-not-find-the-main-class-namenood-code-a – Chiron

Answer


The cause is probably the difference in line endings between Unix (LF) and Windows (CRLF).
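You can reproduce the `$'\r': command not found` messages from the transcript on a throwaway file: when a script is saved with CRLF line endings, bash treats the trailing carriage return as part of the command name. A minimal sketch (the `/tmp/dos-line.sh` path is just an example):

```shell
# A one-line script saved with a DOS (CRLF) line ending.
printf 'true\r\n' > /tmp/dos-line.sh

# bash parses the line as the command "true\r", which does not exist,
# so it reports: $'true\r': command not found
bash /tmp/dos-line.sh 2>&1 || true
```

This is exactly why hadoop-config.sh fails line by line after being edited or unpacked with a Windows tool.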

I think the answer is to run the dos2unix command on the file, as suggested in this post [possible duplicate].
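Note that the transcript converts only bin/*.sh and conf/*.sh, while the script that is still failing is libexec/hadoop-config.sh, so the whole tree needs converting (for example `find ~/hadoop-1.2.1 -name '*.sh' -exec dos2unix {} +`). Where dos2unix is not installed, `tr -d '\r'` is a portable stand-in; a minimal sketch on a throwaway file:

```shell
# Portable dos2unix substitute: strip the carriage returns with tr.
# Demonstrated on a throwaway script rather than the real Hadoop tree.
printf 'echo fixed\r\n' > /tmp/crlf-fix.sh            # script with CRLF endings
tr -d '\r' < /tmp/crlf-fix.sh > /tmp/crlf-fix.sh.lf   # remove every \r byte
mv /tmp/crlf-fix.sh.lf /tmp/crlf-fix.sh               # replace the original
sh /tmp/crlf-fix.sh                                   # now prints: fixed
```

The same `tr` pipeline applied to libexec/hadoop-config.sh should clear the remaining `$'\r'` errors.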