
Running a Spark task with SparkLauncher

From my local Scala app I want to launch a Spark task on my cluster. The task class is my.spark.SparkRunner and it is packaged in a jar that sits on HDFS. This is what I have configured in my local program:

val spark = new SparkLauncher() 
    //.setSparkHome("C:/spark-1.6.0-bin-hadoop2.4") 
    .setVerbose(true) 
    .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar") 
    .setMainClass("my.spark.SparkRunner") 
    .setMaster("spark://192.168.10.183:7077") 
    //.setMaster("192.168.10.183:7077") 
    .launch()   // returns the java.lang.Process wrapping spark-submit

spark.waitFor()   // waits for the spark-submit process to exit

It does not throw an error, but it returns immediately and the task is never started. What am I doing wrong? Thanks...
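For reference, launch() only hands back the java.lang.Process around spark-submit, and nothing in the snippet above reads what that process prints, so the reason for the immediate return is easy to miss. A minimal diagnostic sketch with the same settings (draining stderr is just one way to surface the output, not part of the original code):

import scala.io.Source

import org.apache.spark.launcher.SparkLauncher

// Same launcher configuration as above, but keep the Process handle and
// drain its stderr so spark-submit's output is not silently discarded.
val process = new SparkLauncher()
    .setVerbose(true)
    .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar")
    .setMainClass("my.spark.SparkRunner")
    .setMaster("spark://192.168.10.183:7077")
    .launch()

// spark-submit writes its logging (including --verbose output) to stderr.
Source.fromInputStream(process.getErrorStream).getLines().foreach(println)

val exitCode = process.waitFor()   // blocks until spark-submit exits
println(s"spark-submit exited with code $exitCode")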

Answer


I just added a check of the launcher's state, that's it...

val spark = new SparkLauncher() 
    //.setSparkHome("C:/spark-1.6.0-bin-hadoop2.4") 
    .setVerbose(true) 
    .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar") 
    .setMainClass("my.spark.SparkRunner") 
    .setMaster("spark://192.168.10.183:7077") 
    //.setMaster("192.168.10.183:7077") 
    .startApplication()   // returns a SparkAppHandle instead of a Process

// Poll the handle until the application reports FINISHED.
while (spark.getState.toString != "FINISHED") {
    println(spark.getState)
    Thread.sleep(1000)
}
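One thing to be aware of with the loop above: FINISHED is only one of the terminal states, so if the application fails or is killed the loop never exits. A minimal sketch with the same launcher settings that waits on a SparkAppHandle.Listener and uses State.isFinal instead of comparing against a single state (the CountDownLatch is just one way to block until the callback fires):

import java.util.concurrent.CountDownLatch

import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

// Wait for any terminal state (FINISHED, FAILED or KILLED) instead of
// polling for the single FINISHED state.
val done = new CountDownLatch(1)

val handle = new SparkLauncher()
    .setVerbose(true)
    .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar")
    .setMainClass("my.spark.SparkRunner")
    .setMaster("spark://192.168.10.183:7077")
    .startApplication(new SparkAppHandle.Listener {
      override def stateChanged(h: SparkAppHandle): Unit = {
        println(s"state changed: ${h.getState}")
        if (h.getState.isFinal) done.countDown()   // isFinal covers FINISHED, FAILED and KILLED
      }
      override def infoChanged(h: SparkAppHandle): Unit = ()
    })

done.await()
println(s"final state: ${handle.getState}, appId: ${handle.getAppId}")

The listener avoids the busy-wait entirely, and checking isFinal means a failed or killed application also unblocks the caller instead of hanging it.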