2012-08-04

I installed Hadoop on a Linux cluster. When I try to start the servers with the command $ bin/start-all.sh, I get the following errors and the hadoop daemons do not start:

mkdir: cannot create directory `/var/log/hadoop/spuri2': Permission denied 
chown: cannot access `/var/log/hadoop/spuri2': No such file or directory 
/home/spuri2/spring_2012/Hadoop/hadoop/hadoop-1.0.2/bin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-spuri2-namenode.pid: Permission denied 
head: cannot open `/var/log/hadoop/spuri2/hadoop-spuri2-namenode-gpu02.cluster.out' for reading: No such file or directory 
localhost: /home/spuri2/.bashrc: line 10: /act/Modules/3.2.6/init/bash: No such file or directory 
localhost: mkdir: cannot create directory `/var/log/hadoop/spuri2': Permission denied 
localhost: chown: cannot access `/var/log/hadoop/spuri2': No such file or directory 

I have configured the log directory parameter in conf/hadoop-env.sh to point to /tmp, and I have set "hadoop.tmp.dir" in core-site.xml to a /tmp/ directory, since I have no access to the /var/log directory. Yet the hadoop daemons are still trying to write to /var/log and failing.

I would like to know why this is happening.
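For reference, the overrides I attempted amount to roughly the following (a sketch: HADOOP_LOG_DIR and HADOOP_PID_DIR are the usual Hadoop 1.x hadoop-env.sh settings, and the /tmp paths are example values, not the exact ones from my cluster):

```shell
# Redirect Hadoop's log and pid directories to locations the current user can
# write, instead of the root-owned /var/log/hadoop and /var/run/hadoop.
USER="${USER:-$(id -un)}"
export HADOOP_LOG_DIR="/tmp/hadoop-$USER/logs"   # instead of /var/log/hadoop
export HADOOP_PID_DIR="/tmp/hadoop-$USER/pids"   # instead of /var/run/hadoop
mkdir -p "$HADOOP_LOG_DIR" "$HADOOP_PID_DIR"
```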

Answers

1

You have to configure this directory in the "core-site.xml" file, not only in hadoop-env.sh:

<?xml version="1.0"?> 
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?> 

<!-- Put site-specific property overrides in this file. --> 

<configuration> 
<property> 
    <name>hadoop.tmp.dir</name> 
    <value>/Directory_hadoop_user_have_permission/temp/${user.name}</value> 
    <description>A base for other temporary directories.</description> 
</property> 

<property> 
    <name>fs.default.name</name> 
    <value>hdfs://localhost:54310</value> 
    <description>The name of the default file system. A URI whose 
    scheme and authority determine the FileSystem implementation. The 
    uri's scheme determines the config property (fs.SCHEME.impl) naming 
    the FileSystem implementation class. The uri's authority is used to 
    determine the host, port, etc. for a filesystem.</description> 
</property> 

</configuration> 
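Whatever value you pick for hadoop.tmp.dir, the directory has to exist and be writable by the user that runs the daemons. A minimal sketch, with $HOME/hadoop-temp as an assumed example path standing in for the placeholder in the XML above:

```shell
# Create the base directory for hadoop.tmp.dir with per-user subdirectories,
# mirroring the /Directory_hadoop_user_have_permission/temp/${user.name} value.
USER="${USER:-$(id -un)}"
TMPBASE="$HOME/hadoop-temp"
mkdir -p "$TMPBASE/$USER"    # per-user temp dir, matches ${user.name} in the XML
chmod 700 "$TMPBASE/$USER"   # only this user can read/write it
```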
+1

I tried that too, but nothing changed. – 2012-08-04 02:21:00

+0

What is the $HADOOP_HOME variable set to in your .bashrc file? – 2012-08-04 02:49:02

+0

There is no $HADOOP_HOME entry in my .bashrc file, and I cannot edit that file because I don't have permission. What I did was set the HADOOP_HOME environment variable with the export command. However, that did not work either. – 2012-08-04 03:08:37
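One thing to keep in mind: export only affects the current shell and its children, so it must be run in the same session (or a sourced script) that launches start-all.sh. A sketch, with the path assumed from the error output above:

```shell
# Set HADOOP_HOME for this session and put its scripts on PATH; the install
# location below is inferred from the error messages and may differ.
export HADOOP_HOME="$HOME/spring_2012/Hadoop/hadoop/hadoop-1.0.2"
export PATH="$HADOOP_HOME/bin:$PATH"
```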

0

To cut a long story short, I ran into this problem because there were multiple Hadoop installations on the university cluster. A Hadoop installation done as root had messed up my local hadoop installation.

The reason the Hadoop daemons would not start is that they could not write certain files that required superuser permissions, while I was running Hadoop as a regular user. The problem arose because our university's system administrator had installed Hadoop as root, so when I started my local hadoop installation, the root installation's configuration files took precedence over my local hadoop configuration files. It took a long time to figure this out, but the problem was resolved after the root hadoop installation was removed.
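A few commands that help spot a competing root-owned install like this, assuming a POSIX shell (a diagnostic sketch, not a fix by itself):

```shell
# Check which hadoop binary and which configuration actually win when several
# installations coexist on one machine.
HADOOP_BIN="$(command -v hadoop || true)"
echo "hadoop on PATH: ${HADOOP_BIN:-none}"            # first hadoop found wins
echo "HADOOP_CONF_DIR: ${HADOOP_CONF_DIR:-unset}"     # a root-owned conf dir here overrides local files
# Root ownership here is exactly what produces the "Permission denied" errors:
ls -ld /var/log/hadoop 2>/dev/null || echo "/var/log/hadoop not present"
```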