2017-08-06 86 views

I am trying a simple word count program using Spark, but it fails when I try to initialize the Spark context. Below is my code, which fails to initialize the Spark context from Java.

conf = new SparkConf(true)
        .setAppName("WordCount")
        .setMaster("spark://192.168.0.104:7077");

sc = new JavaSparkContext(conf);

A few things I want to clarify: I am using Spark version 2.1.1, my Java code runs on Windows 10, and my server runs in a VirtualBox VM. I disabled the firewall in the VM, and I can access the URL http://192.168.0.104:8080/ from Windows.

But when I run the code, I get the following stack trace:

17/08/06 18:44:15 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.0.103:4040 
17/08/06 18:44:15 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://192.168.0.104:7077... 
17/08/06 18:44:15 INFO TransportClientFactory: Successfully created connection to /192.168.0.104:7077 after 41 ms (0 ms spent in bootstraps) 
17/08/06 18:44:15 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 192.168.0.104:7077 
org.apache.spark.SparkException: Exception thrown in awaitResult 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75) 
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 

Can someone please help?

Answer


You need to import some Spark classes into your program. Add the following lines:

import org.apache.spark.api.java.JavaSparkContext; 
import org.apache.spark.api.java.JavaRDD; 
import org.apache.spark.SparkConf; 

SparkConf conf = new SparkConf().setAppName("WordCount").setMaster("local"); 
JavaSparkContext sc = new JavaSparkContext(conf);
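Once the context initializes, the word-count logic itself is straightforward. As a minimal sketch independent of Spark (so it does not depend on the master connection working), the same counting step can be expressed with plain Java streams; in Spark you would perform the equivalent with `flatMap`, `mapToPair`, and `reduceByKey` on an RDD:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCount {
    // Split each line on whitespace and count occurrences of each word.
    static Map<String, Long> count(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.split("\\s+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = count(Arrays.asList("to be or not to be"));
        System.out.println(counts); // e.g. {not=1, be=2, or=1, to=2}
    }
}
```

This mirrors the shape of the Spark job: flatten lines into words, group by word, and reduce to a count per key.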