
I am trying to stream Twitter data using Spark with Scala and SBT. Everything builds fine, but I keep getting a Spark Streaming exception in thread "main".

This is my build.sbt:

import Assembly._ 
import AssemblyPlugin._ 

name := "TwitterSparkStreaming" 
version := "0.1" 
scalaVersion := "2.12.3" 

libraryDependencies ++= Seq(
    "org.apache.spark" % "spark-core_2.11" % "1.5.2", 
    "org.apache.spark" % "spark-sql_2.11" % "1.5.2", 
    "org.apache.spark" % "spark-streaming_2.11" % "1.5.2", 
    "org.apache.spark" % "spark-streaming-twitter_2.11" % "1.6.3", 
    "joda-time" %% "joda-time" % "2.9.1", 
    "org.twitter4j" % "twitter4j-core" % "3.0.3", 
    "org.twitter4j" % "twitter4j-stream" % "3.0.3", 
    "edu.stanford.nlp" % "stanford-corenlp" % "3.5.2", 
    "edu.stanford.nlp" % "stanford-corenlp" % "3.5.2" classifier "models" 
) 

resolvers += "Akka Repository" at "http://repo.akka.io./releases/" 

assemblyMergeStrategy in assembly := { 
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard 
    case x => MergeStrategy.first 
} 

This is the class that uses org.apache.spark.Logging:

import org.apache.log4j.{Logger, Level} 
import org.apache.spark.Logging 

object LogUtils extends Logging { 
    def setStreamingLogLevels(): Unit = { 
        val log4jInitialized = Logger.getRootLogger.getAllAppenders.hasMoreElements 
        if (!log4jInitialized) { 
            logInfo("Setting log level to [WARN] for streaming example." + 
                " To override add a custom log4j.properties to the classpath.") 
            Logger.getRootLogger.setLevel(Level.WARN) 
        } 
    } 
} 
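
As an aside, the same helper can be written without mixing in org.apache.spark.Logging at all; that trait was removed from Spark's public API in 2.0 (it moved to org.apache.spark.internal.Logging and became private to Spark). A minimal sketch that relies only on log4j, which spark-core already brings in (the object name StreamingLogLevels is just a placeholder):

import org.apache.log4j.{Level, Logger} 

// Hypothetical variant of the helper above that does not extend 
// org.apache.spark.Logging, so it is independent of the Spark version. 
object StreamingLogLevels { 
    private val log = Logger.getLogger(getClass) 

    def setStreamingLogLevels(): Unit = { 
        // Only override the level if no log4j configuration was loaded. 
        val log4jInitialized = Logger.getRootLogger.getAllAppenders.hasMoreElements 
        if (!log4jInitialized) { 
            log.info("Setting log level to [WARN] for streaming example. " + 
                "To override add a custom log4j.properties to the classpath.") 
            Logger.getRootLogger.setLevel(Level.WARN) 
        } 
    } 
} 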

This is the error that keeps appearing:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.Logging.$init$(Lorg/apache/spark/Logging;)V 
    at LogUtils$.<init>(LogUtils.scala:4) 
    at LogUtils$.<clinit>(LogUtils.scala) 
    at TwitterStreaming$.main(TwitterStreaming.scala:30) 
    at TwitterStreaming.main(TwitterStreaming.scala) 

How can I fix it?

Note: I tried changing the org.apache.spark dependencies from version 2.2.0 to 1.5.2, but the problem is the same.
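
Incidentally, one mismatch stands out in the build.sbt above: scalaVersion is 2.12.3 while every Spark artifact carries the _2.11 suffix, and spark-streaming-twitter is on 1.6.3 while the other Spark modules are on 1.5.2. Spark 1.5.x was never published for Scala 2.12, so a version-consistent sketch of those entries (the versions shown are only an example) would look like:

scalaVersion := "2.11.8" 

// Keep every Spark module on one version; %% makes sbt append the 
// Scala binary suffix that matches scalaVersion automatically. 
val sparkVersion = "1.5.2" 

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core"              % sparkVersion, 
    "org.apache.spark" %% "spark-sql"               % sparkVersion, 
    "org.apache.spark" %% "spark-streaming"         % sparkVersion, 
    "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion 
) 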


Try this: https://community.hortonworks.com/questions/58286/noclassdeffounderror-orgapachesparklogging-using-s.html –


Thanks @YosiDahari, but that did not work. I just reinstalled Spark with version 1.5.2 and then it worked, thank you very much –

Answer


I am not sure why this code block is giving the error, but there is a better way to set the log level in Spark.

Please refer to https://spark.apache.org/docs/latest/api/java/org/apache/spark/SparkContext.html#setLogLevel-java.lang.String-

Spark provides a method on SparkContext for this, so you can call:

sparkContext.setLogLevel("WARN")
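
For context, a minimal sketch of how that call fits into a streaming app (the object name, master and batch interval below are placeholders, not part of the original code):

import org.apache.spark.{SparkConf, SparkContext} 
import org.apache.spark.streaming.{Seconds, StreamingContext} 

object TwitterStreamingExample { 
    def main(args: Array[String]): Unit = { 
        val conf = new SparkConf().setAppName("TwitterSparkStreaming").setMaster("local[*]") 
        val sc = new SparkContext(conf) 
        // Replaces the custom LogUtils helper; setLogLevel exists since Spark 1.4. 
        sc.setLogLevel("WARN") 

        val ssc = new StreamingContext(sc, Seconds(10)) 
        // ... register the Twitter DStream here, then call ssc.start() and ssc.awaitTermination() 
    } 
} 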


Thank you Ganesh, but it did not work. –


I just reinstalled Spark with version 1.5.2 and then it worked, thank you very much –


@AmaniAlFarasani No problem. If it was helpful, please upvote the answer. – Ganesh