
Spark Shell "Failed to initialize compiler" error on Mac

I just set up Spark on my new machine and ran into the error below after installing Java, Scala, and Apache Spark with Homebrew. The installation went as follows:

$ brew cask install java 
$ brew install scala 
$ brew install apache-spark 

Once everything was installed, I hit the following error when trying to run a basic example with spark-shell. Any help is appreciated.

$ spark-shell 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel). 

Failed to initialize compiler: object java.lang.Object in compiler mirror not found. 
** Note that as of 2.8 scala does not assume use of the java classpath. 
** For the old behavior pass -usejavacp to scala, or if using a Settings 
** object programmatically, settings.usejavacp.value = true. 

Failed to initialize compiler: object java.lang.Object in compiler mirror not found. 
** Note that as of 2.8 scala does not assume use of the java classpath. 
** For the old behavior pass -usejavacp to scala, or if using a Settings 
** object programmatically, settings.usejavacp.value = true. 
Exception in thread "main" java.lang.NullPointerException 
    at scala.reflect.internal.SymbolTable.exitingPhase(SymbolTable.scala:256) 
    at scala.tools.nsc.interpreter.IMain$Request.x$20$lzycompute(IMain.scala:896) 
    at scala.tools.nsc.interpreter.IMain$Request.x$20(IMain.scala:895) 
    at scala.tools.nsc.interpreter.IMain$Request.headerPreamble$lzycompute(IMain.scala:895) 
    at scala.tools.nsc.interpreter.IMain$Request.headerPreamble(IMain.scala:895) 
    at scala.tools.nsc.interpreter.IMain$Request$Wrapper.preamble(IMain.scala:918) 
    at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1337) 
    at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1336) 
    at scala.tools.nsc.util.package$.stringFromWriter(package.scala:64) 
    at scala.tools.nsc.interpreter.IMain$CodeAssembler$class.apply(IMain.scala:1336) 
    at scala.tools.nsc.interpreter.IMain$Request$Wrapper.apply(IMain.scala:908) 
    at scala.tools.nsc.interpreter.IMain$Request.compile$lzycompute(IMain.scala:1002) 
    at scala.tools.nsc.interpreter.IMain$Request.compile(IMain.scala:997) 
    at scala.tools.nsc.interpreter.IMain.compile(IMain.scala:579) 
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:567) 
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565) 
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807) 
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681) 
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214) 
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37) 
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:98) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) 
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97) 
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909) 
    at org.apache.spark.repl.Main$.doMain(Main.scala:70) 
    at org.apache.spark.repl.Main$.main(Main.scala:53) 
    at org.apache.spark.repl.Main.main(Main.scala) 
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.base/java.lang.reflect.Method.invoke(Method.java:564) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Can you uninstall the scala package and start over? Spark doesn't need it, since it ships with the necessary libraries/jars. The error looks like a version mismatch between Scala versions (Spark is just surfacing it). –
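If you want to try the commenter's suggestion, a minimal sketch (assuming Scala was installed via Homebrew as shown above; spark-shell uses the Scala runtime bundled with Spark in any case):

$ brew uninstall scala 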

Answers


Spark is incompatible with Java 9, which is the version brew cask install java will install if it is up to date. If you did install Java 9, what you need to do is install Java 8 instead:

brew cask uninstall java 
brew tap caskroom/versions 
brew cask search java 
brew cask install java8 
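To confirm which Java spark-shell will pick up afterwards, a quick check may help (a minimal sketch: /usr/libexec/java_home is the stock macOS helper, and the -v 1.8 selector assumes a JDK 8 install is present):

$ java -version 
# point JAVA_HOME at the Java 8 install so spark-shell uses it 
$ export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)" 
$ spark-shell 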

Sir, you deserve a medal :) –


What else can I try if even this doesn't work? :( – user3768495


On Windows 10 you have to switch to JDK 8: set JAVA_HOME to the JDK 8 install, and drop C:\ProgramData\Oracle\Java\javapath from PATH (it always resolves to JDK 9).
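In a Command Prompt, that might look like the following (session-only; the JDK 8 path below is an example and depends on where your JDK is installed):

> where java 
> set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_152 
> set PATH=%JAVA_HOME%\bin;%PATH% 
> java -version 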


How would this help? – Sunil


I faced the same issue. When I checked the Java version on my laptop, it was 9. I simply changed to Java 8 and found that everything worked fine.

Just give this solution a try. Hopefully it works if you are getting exactly the same error as at the start of this thread.

– Binod