2016-09-14 132 views

I am using Windows 10, Scala 2.10.2, Spark 1.6.1, and Java 1.8. Below is the code I am trying to run in IntelliJ; it fails with "ERROR SparkContext: Error initializing SparkContext."

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object WordsCount {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setAppName("SparkWordCount").setMaster("local[2]")
    val sc = new SparkContext(sparkConf)
    val text = sc.textFile("shortTwitter.txt")
    val counts = text.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
    counts.foreach(println)
  }
}

I get the following error:

"C:\Program Files\Java\jdk1.8.0_101\bin\java" -Didea.launcher.port=7533 "-Didea.launcher.bin.path=C:\Program Files (x86)\JetBrains\IntelliJ IDEA Community Edition 2016.2.3\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program  Files\Java\jdk1.8.0_101\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_101\jre\lib\resources.jar;C:\Program 
Files\Java\jdk1.8.0_101\jre\lib\rt.jar;C:\Users\Administrator\IdeaProjects\choi_jieun_assignment1\out\production\choi_jieun_assignment1;C:\scala\lib\jline.jar;C:\scala\lib\scalap.jar;C:\scala\lib\diffutils.jar;C:\scala\lib\akka-actors.jar;C:\scala\lib\scala-swing.jar;C:\scala\lib\scala-actors.jar;C:\scala\lib\scala-library.jar;C:\scala\lib\scala-partest.jar;C:\scala\lib\scala-reflect.jar;C:\scala\lib\scala-compiler.jar;C:\scala\lib\typesafe-config.jar;C:\scala\lib\scala-actors-migration.jar;C:\Users\Administrator\.ivy2\cache\org.scala-lang\scala-library\jars\scala-library-2.10.2.jar;C:\Users\Administrator\.ivy2\cache\org.scala-lang\scala-reflect\jars\scala-reflect-2.10.2.jar;C:\spark\lib\datanucleus-core-3.2.10.jar;C:\spark\lib\datanucleus-rdbms-3.2.9.jar;C:\spark\lib\spark-1.6.1-yarn-shuffle.jar;C:\spark\lib\datanucleus-api-jdo-3.2.6.jar;C:\spark\lib\spark-assembly-1.6.1-hadoop2.4.0.jar;C:\spark\lib\spark-examples-1.6.1-hadoop2.4.0.jar;C:\Program Files (x86)\JetBrains\IntelliJ IDEA Community Edition 2016.2.3\lib\idea_rt.jar" com.intellij.rt.execution.application.AppMain WordsCount 

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
SLF4J: Class path contains multiple SLF4J bindings. 
SLF4J: Found binding in [jar:file:/C:/spark/lib/spark-assembly-1.6.1-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: Found binding in [jar:file:/C:/spark/lib/spark-examples-1.6.1-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 
16/09/14 16:03:21 INFO SparkContext: Running Spark version 1.6.1 
16/09/14 16:03:22 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
16/09/14 16:03:22 INFO SecurityManager: Changing view acls to: Administrator 
16/09/14 16:03:22 INFO SecurityManager: Changing modify acls to: Administrator 
16/09/14 16:03:22 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(Administrator); users with modify permissions: Set(Administrator) 
16/09/14 16:03:23 INFO Utils: Successfully started service 'sparkDriver' on port 50762. 
16/09/14 16:03:23 ERROR SparkContext: Error initializing SparkContext. 
java.lang.NoSuchMethodException: akka.remote.RemoteActorRefProvider.<init> (java.lang.String, akka.actor.ActorSystem$Settings, akka.event.EventStream,  akka.actor.Scheduler, akka.actor.DynamicAccess) 
at java.lang.Class.getConstructor0(Class.java:3082) 
at java.lang.Class.getDeclaredConstructor(Class.java:2178) 
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:77) 
at scala.util.Try$.apply(Try.scala:161) 
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:74) 
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:85) 
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:85) 
at scala.util.Success.flatMap(Try.scala:200) 
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:85) 
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:546) 
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111) 
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104) 
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121) 
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53) 
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52) 
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1988) 
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) 
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1979) 
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55) 
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:266) 
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193) 
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288) 
at org.apache.spark.SparkContext.<init>(SparkContext.scala:457) 
at WordsCount$.main(WordsCount.scala:7) 
at WordsCount.main(WordsCount.scala) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:498) 
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147) 
16/09/14 16:03:23 INFO SparkContext: Successfully stopped SparkContext 
Exception in thread "main" java.lang.NoSuchMethodException: akka.remote.RemoteActorRefProvider.<init>(java.lang.String, akka.actor.ActorSystem$Settings, akka.event.EventStream, akka.actor.Scheduler, akka.actor.DynamicAccess) 
at java.lang.Class.getConstructor0(Class.java:3082) 
at java.lang.Class.getDeclaredConstructor(Class.java:2178) 
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:77) 
at scala.util.Try$.apply(Try.scala:161) 
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:74) 
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:85) 
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:85) 
at scala.util.Success.flatMap(Try.scala:200) 
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:85) 
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:546) 
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111) 
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104) 
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121) 
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53) 
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52) 
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1988) 
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) 
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1979) 
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55) 
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:266) 
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193) 
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288) 
at org.apache.spark.SparkContext.<init>(SparkContext.scala:457) 
at WordsCount$.main(WordsCount.scala:7) 
at WordsCount.main(WordsCount.scala) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
at    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:498) 
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147) 

Process finished with exit code 1 

I have been struggling with this for hours and cannot figure it out. Can someone help me? Thanks!


What Akka version are you on? – Samar

Answer


Try installing a Java 7 JDK; you will likely have better luck. The release notes for Scala 2.10.2 seem to be missing, but the 2.10.4 notes make clear that it does not support Java 8:

New bytecode emitter based on ASM

Can target JDK 1.5, 1.6 and 1.7

Emits 1.6 bytecode by default

Old 1.5 backend is deprecated

...and even 2.11.x has only experimental Java 8 support:

The Scala 2.11.x series targets Java 6, with (evolving) experimental support for Java 8. In 2.11.1, Java 8 support is mostly limited to reading Java 8 bytecode and parsing Java 8 source. Stay tuned for more complete (experimental) Java 8 support. The next major release, 2.12, will most likely target Java 8 by default.
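To confirm which JVM and Scala runtime the IntelliJ run configuration actually launches, you can print them from the program itself. A minimal sketch (the object name is arbitrary):

```scala
object JvmCheck {
  def main(args: Array[String]): Unit = {
    // Print the Java and Scala versions this process is actually running on.
    println(s"java.version  = ${System.getProperty("java.version")}")
    println(s"scala version = ${scala.util.Properties.versionNumberString}")
  }
}
```

If `java.version` starts with `1.8` while the project is built against Scala 2.10.x, that mismatch is consistent with the Akka reflection failure above.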

See also: Can I use java 8 in an mixed scala 2.10/java project built by sbt?
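For sbt-based projects, pinning the whole toolchain below Java 8 looks roughly like this. This is a sketch under the assumption that sbt is used; the exact version numbers are illustrative, not taken from the original thread:

```scala
// build.sbt (sketch): keep Scala 2.10.x and emit Java 7-compatible bytecode
scalaVersion := "2.10.6"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"

// Force javac and scalac to target Java 7 even if a newer JDK is on the PATH.
javacOptions ++= Seq("-source", "1.7", "-target", "1.7")
scalacOptions += "-target:jvm-1.7"
```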


Very true. 2.10.x and Java 8 do not play well together. Also, Spark 1.6.1 is compiled against Scala 2.10.x by default. In other words, if you want Java 8, you need to move to Scala 2.11.5+ and Spark 2.0. – marios


Thank you so much! I had been struggling with this for so long, and your tip works!!!!!!! Really appreciate it – tobby


Glad to help :). Being somewhat new to the Java ecosystem myself, I wasted a few hours struggling with the same problem. –
