2015-04-02

Change the bind IP running on port 7077 - Apache Spark

Can Spark be configured so that port 7077 binds to 0.0.0.0 instead of the address 127.0.1.1, the same way port 8080 is already bound?

netstat -pln 
(Not all processes could be identified, non-owned process info 
will not be shown, you would have to be root to see it all.) 
Active Internet connections (only servers) 
Proto Recv-Q Send-Q Local Address   Foreign Address   State  PID/Program name 
tcp  0  0 127.0.1.1:7077   0.0.0.0:*    LISTEN  2864/java 
tcp  0  0 0.0.0.0:8080   0.0.0.0:*    LISTEN  2864/java 
tcp  0  0 127.0.1.1:6066   0.0.0.0:*    LISTEN  2864/java 
tcp  0  0 0.0.0.0:22    0.0.0.0:*    LISTEN  - 
udp  0  0 0.0.0.0:68    0.0.0.0:*       - 
udp  0  0 192.168.192.22:123  0.0.0.0:*       - 
udp  0  0 127.0.0.1:123   0.0.0.0:*       - 
udp  0  0 0.0.0.0:123    0.0.0.0:*       - 
udp  0  0 0.0.0.0:21415   0.0.0.0:*       - 
Active UNIX domain sockets (only servers) 
Proto RefCnt Flags  Type  State   I-Node PID/Program name Path 
unix 2  [ ACC ]  STREAM  LISTENING  7195  -     /var/run/dbus/system_bus_socket 
unix 2  [ ACC ]  SEQPACKET LISTENING  405  -     /run/udev/control 
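
As an aside, a plausible (though unconfirmed here) explanation for the 127.0.1.1 binding: Debian-based systems such as Raspbian map the machine's hostname to the loopback alias 127.0.1.1 in /etc/hosts, and the Spark master binds to whatever its hostname resolves to. A quick way to check, assuming the raspberrypi host that appears in the logs below:

cat /etc/hosts
# Typical Raspbian contents (illustrative, not taken from the question):
# 127.0.0.1  localhost
# 127.0.1.1  raspberrypi   <- "raspberrypi" resolves here, so the master binds to it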

The reason I am asking is that I cannot get the workers to connect to the master node, and I think the problem is that the master IP is not discoverable.

The error when trying to connect from the slave to the master:

15/04/02 21:58:18 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@raspberrypi:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: Connection refused: raspberrypi/192.168.192.22:7077 
15/04/02 21:58:18 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef: Message [org.apache.spark.deploy.DeployMessages$RegisterWorker] from Actor[akka://sparkWorker/user/Worker#1677101765] to Actor[akka://sparkWorker/deadLetters] was not delivered. [10] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'. 

Answer


In spark-env.sh you can set SPARK_MASTER_IP=<ip>

A hostname will also work fine (via SPARK_STANDALONE_MASTER=<hostname>); just make sure the workers connect to exactly the same hostname the master is bound to (i.e. the spark:// address shown in the Spark master UI).
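
A minimal sketch of what that looks like in conf/spark-env.sh on the master node, reusing the LAN address and hostname from the question (pick whichever of the two variables fits, and restart the master afterwards):

# conf/spark-env.sh on the master node (sketch)
export SPARK_MASTER_IP=192.168.192.22        # bind the master to the LAN IP
# or, equivalently by hostname:
# export SPARK_STANDALONE_MASTER=raspberrypi

Workers would then connect to the exact address shown in the master UI, e.g. spark://192.168.192.22:7077.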

Do you mean adding it in this format (quotes excluded): "SPARK_MASTER_IP "? – 2015-04-02 21:41:55

I've tried setting the hostname manually using "./bin/spark-class org.apache.spark.deploy.worker.Worker spark://192.168.192.22:7077", but since the master does not accept connections on this IP, as can be seen in the question, it will not connect. – 2015-04-02 21:44:56

Thank you very much. I added "export SPARK_STANDALONE_MASTER=raspberrypi export SPARK_MASTER_IP=192.168.192.22" to /conf/start-env.sh on the master node. Spark then registered the master, which shows up in the UI as "URL: spark://192.168.192.22:7077", and the worker connected to the master with the command ./bin/spark-class org.apache.spark.deploy.worker.Worker spark://192.168.192.22:7077. So it looks good. Thanks again, this really got me further. Not sure whether I need both parameters in start-env.sh, but I will leave it as it is. Thanks – 2015-04-02 22:32:33
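
For completeness, a quick way to verify the fix on the master, reusing the netstat check from the question (the expected output line is a sketch, not captured output):

netstat -pln | grep 7077
# tcp  0  0 192.168.192.22:7077  0.0.0.0:*  LISTEN  <pid>/java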