2017-09-26 261 views

I'm using Spark 2.2 with Scala 2.11.8 on a CDH 5.10 cluster. Everything was working fine, but I suddenly started getting this from my driver code: java.lang.LinkageError in Spark Streaming

 Exception in thread "main" java.lang.LinkageError: loader constraint violation: when resolving method 
    "org.apache.spark.streaming.StreamingContext$.getOrCreate(Ljava/lang/String;Lscala/Function0;Lorg/apache/hadoop/conf/Configuration;Z)Lorg/apache/spark/streaming/StreamingContext;" 
    the class loader (instance of org/apache/spark/util/ChildFirstURLClassLoader) of the current class, com/hp/hawkeye/driver/StreamingDriver$, 
    and the class loader (instance of sun/misc/Launcher$AppClassLoader) 
for the method's defining class, org/apache/spark/streaming/StreamingContext$, 
have different Class objects for the type scala/Function0 used in the signature 

Any idea how I can fix this?

Answer


Figured out the solution - there was a class loader conflict, caused by manually placing dependency jars on the cluster. These commands helped:

    rm -rf ~/.sbt
    rm -rf ~/.ivy2/cache

Then restart IntelliJ IDEA. After that, spark-submit on the cluster worked fine. But placing an extra dependency jar (spark-avro-assembly-4.0.0-snapshot) in lib brought the problem back. Somehow that jar - the one that patches spark-avro 3.2 to work with Spark 2.2 - triggers the conflict.
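For anyone hitting this: the underlying mechanism is that the same class (here scala.Function0) gets loaded by two different class loaders, and the JVM treats the two copies as unrelated types, so a method signature mentioning one copy can't be linked against the other. Here is a minimal, self-contained Java sketch of that mechanism - the loader and class names are made up for the demo; Spark's real child-first loader is org.apache.spark.util.ChildFirstURLClassLoader:

```java
import java.io.InputStream;

public class Main {
    // A minimal child-first-style loader: it defines the class from bytes
    // itself instead of delegating to a parent, so every loader instance
    // produces its own, distinct Class object for the same class name.
    static class IsolatingLoader extends ClassLoader {
        private final String name;
        private final byte[] bytes;

        IsolatingLoader(String name, byte[] bytes) {
            super(null); // no parent: only the bootstrap loader is consulted
            this.name = name;
            this.bytes = bytes;
        }

        @Override
        protected Class<?> findClass(String n) throws ClassNotFoundException {
            if (n.equals(name)) {
                return defineClass(n, bytes, 0, bytes.length);
            }
            throw new ClassNotFoundException(n);
        }
    }

    static Class<?> a;
    static Class<?> b;

    public static void main(String[] args) throws Exception {
        // Read the compiled bytes of the nested class from the classpath.
        byte[] bytes;
        try (InputStream in =
                 Main.class.getResourceAsStream("Main$IsolatingLoader.class")) {
            bytes = in.readAllBytes();
        }

        // Load the same bytes through two independent loaders.
        a = new IsolatingLoader("Main$IsolatingLoader", bytes)
                .loadClass("Main$IsolatingLoader");
        b = new IsolatingLoader("Main$IsolatingLoader", bytes)
                .loadClass("Main$IsolatingLoader");

        // Same name, but the JVM treats them as two unrelated types -
        // this is exactly the "have different Class objects for the type"
        // condition behind the LinkageError above.
        System.out.println(a.getName().equals(b.getName())); // true
        System.out.println(a == b);                          // false
        System.out.println(a.isAssignableFrom(b));           // false
    }
}
```

The fix reported above works because clearing ~/.sbt and ~/.ivy2/cache (and removing the extra assembly jar) leaves only one copy of the Scala classes visible, so a single loader resolves scala.Function0 everywhere.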