I get this error when I try to run the SparkPi example: "Master must start with yarn, spark"
[email protected]:~/spark-1.2.0-bin-hadoop2.4$ /home/beyhan/spark-1.2.0-bin-hadoop2.4/bin/spark-submit --master ego-client --class org.apache.spark.examples.SparkPi /home/beyhan/spark-1.2.0-bin-hadoop2.4/lib/spark-examples-1.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Error: Master must start with yarn, spark, mesos, or local
Run with --help for usage help or --verbose for debug output
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
And I have already started the master from another terminal:
>./sbin/start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /home/beyhan/spark-1.2.0-bin-hadoop2.4/sbin/../logs/spark-beyhan-org.apache.spark.deploy.master.Master-1-beyhan.out
Any suggestions on how to start my master? Thanks.
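The error comes from the `--master ego-client` flag, which is not a master URL Spark accepts. A sketch of valid invocations, assuming the paths from the question and that the standalone master started by `start-master.sh` is reachable at `spark://beyhan:7077` (the actual `spark://host:port` URL is printed in the master's log and shown in its web UI; the host name here is an assumption):

```shell
# Option 1: run locally with 2 worker threads, no cluster needed
/home/beyhan/spark-1.2.0-bin-hadoop2.4/bin/spark-submit \
  --master local[2] \
  --class org.apache.spark.examples.SparkPi \
  /home/beyhan/spark-1.2.0-bin-hadoop2.4/lib/spark-examples-1.jar

# Option 2: point at the standalone master started with start-master.sh
# (replace spark://beyhan:7077 with the URL from the master log / web UI)
/home/beyhan/spark-1.2.0-bin-hadoop2.4/bin/spark-submit \
  --master spark://beyhan:7077 \
  --class org.apache.spark.examples.SparkPi \
  /home/beyhan/spark-1.2.0-bin-hadoop2.4/lib/spark-examples-1.jar
```

Either form satisfies the check behind "Master must start with yarn, spark, mesos, or local"; the jar path must still point at a file that actually exists.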
I tried /home/beyhan/spark-1.2.0-bin-hadoop2.4/bin/spark-submit --master local --class org.apache.spark.examples.SparkPi /home/beyhan/spark-1.2.0-bin-hadoop2.4/lib/spark-examples-1.jar, but got another error: Warning: Local jar /home/beyhan/spark-1.2.0-bin-hadoop2.4/lib/spark-examples-1.jar does not exist, skipping. –
Are you sure you have spark-examples-1.jar in that folder? It doesn't look like it. – drstein
You're right, I don't have any jar file there. How can I create a jar file from my Scala code? –
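For the follow-up question, here is a minimal sketch of packaging Scala code into a jar with sbt. This assumes sbt is installed; the project name and file names are hypothetical, and note that the Spark 1.2.0 binary distribution also ships a prebuilt examples jar under its `lib/` directory (with a versioned name, not `spark-examples-1.jar`), so building one is only needed for your own code:

```shell
# Hypothetical project layout:
#   build.sbt
#   src/main/scala/SparkPi.scala
#
# Example build.sbt contents (Spark 1.2.0 was built against Scala 2.10):
#   name := "spark-pi"
#   version := "1.0"
#   scalaVersion := "2.10.4"
#   libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"

# Compile and package; the jar lands under target/scala-2.10/
sbt package
```

The resulting jar (e.g. `target/scala-2.10/spark-pi_2.10-1.0.jar`) is what you pass as the last argument to `spark-submit`.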