
I followed the link below and used the sbt-eclipse plugin to create a Spark Scala application in Eclipse, but the application does not run inside Eclipse.

https://www.nodalpoint.com/development-and-deployment-of-spark-applications-with-scala-eclipse-and-sbt-part-1-installation-configuration/

I followed all the steps and was able to run the SampleApp with sbt. However, when I import the application into Eclipse, I cannot run it there, although I can run it line by line in the Scala interpreter. Below is the error I get when running the application. Any idea what went wrong?

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/09/12 22:27:55 INFO SparkContext: Running Spark version 1.6.0
17/09/12 22:27:56 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/09/12 22:27:56 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:401)
    at TowerLocator$.main(TowerLocator.scala:11)
    at TowerLocator.main(TowerLocator.scala)
17/09/12 22:27:56 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:401)
    at TowerLocator$.main(TowerLocator.scala:11)
    at TowerLocator.main(TowerLocator.scala)

Thanks

Add setMaster("local[*]") to your code to specify it. – bigdatamann

Answer

You have to specify the master URL when launching the application from within Eclipse:

val conf = new SparkConf().setAppName("Sample Application").setMaster("local[*]") 
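
For completeness, here is a minimal sketch of what a full main method with the master set could look like. The object name TowerLocator comes from the stack trace above; the app name and the RDD work are placeholders, not the asker's actual code:

import org.apache.spark.{SparkConf, SparkContext}

object TowerLocator {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark in-process with one worker thread per core,
    // so no external master is needed when launching from Eclipse
    val conf = new SparkConf()
      .setAppName("Sample Application")
      .setMaster("local[*]")
    val sc = new SparkContext(conf)

    // placeholder computation; the real TowerLocator logic is not shown in the question
    val data = sc.parallelize(1 to 100)
    println(s"count = ${data.count()}")

    sc.stop()
  }
}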

When launching from the shell, you specify it with the --master parameter instead.
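
For example, with spark-submit (the class name matches the stack trace; the jar path is a guess based on a typical sbt package output for Scala 2.10, as in the tutorial):

spark-submit --master local[*] --class TowerLocator target/scala-2.10/sample-app_2.10-1.0.jar

Note that a master set explicitly in code via setMaster overrides whatever --master is passed on the command line, so it is common to leave it out of the code for cluster deployments.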

I had already added this in my Scala application when creating the conf. – SuSri
