
Error when executing a Scala build with Spark 1.5.2 and Scala 2.11.7

I have a simple Scala object file with the following contents:

import org.apache.spark.SparkContext 
import org.apache.spark.SparkContext._ 
import org.apache.spark.SparkConf 


object X { 
     def main(args: Array[String]) { 

     val params = Map[String, String](
       "abc" -> "22") 
     println("Creating Spark Configuration"); 
     val conf = new SparkConf().setAppName("X") 
     val sc = new SparkContext(conf) 
     val txtFileLines = sc.textFile("/tmp/x.txt", 2).cache() 
     val count = txtFileLines.count() 
     println("Count" + count) 
    } 
} 

My build.sbt looks like this:

name := "x" 

version := "1.0" 

scalaVersion := "2.11.7" 

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided" 

I then run sbt package, which creates x.jar under target/scala-2.11/.

When I run the above code with spark-submit --class X --master local[2] x.jar, I get the following error:

Creating Spark Configuration 
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object; 
    at Sweeper$.main(Sweeper.scala:14) 
    at Sweeper.main(Sweeper.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
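For context: the only -> in this code is the one inside the Map literal, and that arrow syntax is supplied by the implicit class scala.Predef.ArrowAssoc. A NoSuchMethodError on Predef$.ArrowAssoc therefore points at a Scala binary mismatch: the jar was compiled against one major Scala version and is being run against a scala-library of another, which does not expose the signature the compiled class expects. A rough, hand-written desugaring for illustration only (the object and val names are made up for this sketch):

// Illustrative sketch of what "abc" -> "22" expands to: the compiler inserts
// the Predef.ArrowAssoc conversion, and -> then builds the tuple.
object ArrowAssocSketch { 
  val pair: (String, String) = scala.Predef.ArrowAssoc("abc").->("22") 
} 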

Is your Spark built for Scala 2.10 or 2.11? – Reactormonk


It's Spark 1.5.2 – Neel


The question is which Scala version it was built with. Scala binaries are not compatible between major versions (i.e. 2.10 and 2.11), and Spark's default Scala version is 2.10. – zero323
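In practical terms (a minimal sketch, assuming a pre-built Spark 1.5.2 distribution, which ships compiled against Scala 2.10 by default): either build the project with a 2.10 scalaVersion so the jar matches the scala-library on Spark's classpath, or use a Spark distribution built for Scala 2.11. The build.sbt below shows the first option; the 2.10.6 patch version is just an example:

name := "x" 

version := "1.0" 

// Must match the Scala major version of the Spark distribution that runs spark-submit. 
// Pre-built Spark 1.5.2 binaries are compiled against Scala 2.10. 
scalaVersion := "2.10.6" 

// %% appends the Scala binary suffix, so this now resolves spark-core_2.10. 
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided" 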

Answer