
I am initializing a SparkContext with this code and get the error: a master URL must be set in your configuration.

The error output is:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 

17/02/03 20:39:24 INFO SparkContext: Running Spark version 2.1.0 

17/02/03 20:39:25 WARN NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable 

17/02/03 20:39:25 WARN SparkConf: Detected deprecated memory fraction 
settings: [spark.storage.memoryFraction]. As of Spark 1.6, execution and 
storage memory management are unified. All memory fractions used in the old 
model are now deprecated and no longer read. If you wish to use the old 
memory management, you may explicitly enable `spark.memory.useLegacyMode` 
(not recommended). 

17/02/03 20:39:25 ERROR SparkContext: Error initializing SparkContext. 

org.apache.spark.SparkException: A master URL must be set in your 
configuration 
at org.apache.spark.SparkContext.<init>(SparkContext.scala:379) 
at PCA$.main(PCA.scala:26) 
at PCA.main(PCA.scala) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:498) 
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144) 

17/02/03 20:39:25 INFO SparkContext: Successfully stopped SparkContext 
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration 
at org.apache.spark.SparkContext.<init>(SparkContext.scala:379) 
at PCA$.main(PCA.scala:26) 
at PCA.main(PCA.scala) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:498) 
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144) 

Process finished with exit code 1 

Possible duplicate of [Spark - Error "A master URL must be set in your configuration" when submitting an application](https://stackoverflow.com/questions/38008330/spark-error-a-master-url-must-set-in-your-configuration-when-submitting-a) –

Answers

4

The error message is quite clear: you have to provide the address of the Spark master node, either through the SparkContext or when you submit the job with spark-submit:

val conf = 
    new SparkConf() 
    .setAppName("ClusterScore") 
    .setMaster("spark://172.1.1.1:7077") // <--- This is what's missing 
    .set("spark.storage.memoryFraction", "1") 

val sc = new SparkContext(conf) 
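
Since the stack trace shows the job being launched from IntelliJ (com.intellij.rt.execution.application.AppMain) rather than through spark-submit, another option is to keep the master out of the source and pass it as a JVM system property in the run configuration. This is only a minimal sketch, assuming the default SparkConf constructor, which reads spark.* Java system properties; local[*] is just an example value for running on all local cores:

    // IntelliJ run configuration, VM options: -Dspark.master=local[*]
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("ClusterScore")   // no setMaster call; the master is picked up from the spark.master system property
    val sc = new SparkContext(conf)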

Thank you, it works. – fakherzad


Now I have another question about this code: how do I read in my own text file? Instead of "/data/kddcupdata/kddcup.trasfrom.nou2r" I want to use my text file saved at "C://kddcup.data_10_percent_corrected.txt". Please help, how can I do that? – fakherzad


@fakherzad You can use 'file:///kddcup.data_10_percent_corrected.txt' to read a file from your local machine. –
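
For example, using the sc created in the answer above, a minimal sketch of loading that file into an RDD; the Windows path is the one mentioned in the comments, so adjust it to wherever the file actually lives:

    val data = sc.textFile("file:///C:/kddcup.data_10_percent_corrected.txt") // read from the local file system, not HDFS
    println(data.count())                                                     // quick sanity check: number of lines loaded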

5

If you are running Spark in standalone mode, then:

val conf = new SparkConf().setMaster("spark://master") //missing 

or you can pass the master as an argument when submitting the job:

spark-submit --master spark://master 

If you are running Spark locally, then:

val conf = new SparkConf().setMaster("local[2]") //missing 

or you can pass the master when submitting the job:

spark-submit --master local 

If you are running Spark on YARN, then:

spark-submit --master yarn 
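
In all three cases, if the master is given on the spark-submit command line, it is best to leave setMaster out of the code entirely, since a master set directly in the SparkConf takes precedence over the --master flag. A minimal sketch, reusing the app name from the answer above:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setAppName("ClusterScore") // no setMaster: the value from spark-submit --master is used
    val sc = new SparkContext(conf)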
1
val conf = new SparkConf() 
      .setAppName("Your Application Name") 
      .setMaster("local") 
val sc = new SparkContext(conf) 

It will work...
