2016-11-30

Connecting to a Spark cluster remotely

I am trying to connect to a Spark master node (a remote cluster node) from my local system using a Java program. I am connecting with the following API:

SparkConf conf = new SparkConf().setAppName("WorkCountApp").setMaster("spark://masterIP:7077"); 
JavaSparkContext sc = new JavaSparkContext(conf); 

My program tries to connect to the master but fails after a while. Below is the stack trace:

16/11/30 17:40:26 INFO AppClient$ClientActor: Connecting to master akka.tcp://[email protected]:7077/user/Master... 
16/11/30 17:40:46 ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up. 
16/11/30 17:40:46 WARN SparkDeploySchedulerBackend: Application ID is not initialized yet. 
16/11/30 17:40:46 INFO SparkUI: Stopped Spark web UI at http://172.31.11.1:4040 
16/11/30 17:40:46 INFO DAGScheduler: Stopping DAGScheduler 
16/11/30 17:40:46 INFO SparkDeploySchedulerBackend: Shutting down all executors 
16/11/30 17:40:46 INFO SparkDeploySchedulerBackend: Asking each executor to shut down 
16/11/30 17:40:46 ERROR OneForOneStrategy: 
java.lang.NullPointerException 
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext 
    at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103) 

Please help me with this.

Answer

There are many possible reasons for a connection failure. In this case, however, it looks like no worker has been instantiated for this Spark master.

On the remote machine, you need to start the Spark master as well as the Spark workers (slaves).
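As a sketch, the standalone cluster can be brought up with the scripts shipped in Spark's `sbin` directory. This assumes `SPARK_HOME` points at the Spark installation on the remote machine, and `masterIP` is a placeholder for the master's actual hostname; in Spark versions of this era the worker script is named `start-slave.sh` (renamed `start-worker.sh` in later releases):

```shell
# On the master machine: start the standalone master.
# It listens on port 7077 by default and serves a web UI on 8080.
$SPARK_HOME/sbin/start-master.sh

# On each worker machine: start a worker and register it with the master.
# The URL must match the one shown at the top of the master's web UI.
$SPARK_HOME/sbin/start-slave.sh spark://masterIP:7077
```

After both are running, the master's web UI (http://masterIP:8080) should list the worker under "Workers"; only then can an application submitted with `setMaster("spark://masterIP:7077")` be granted executors.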