
This is a bit odd. When running something as simple as sparkContext.parallelize(List("1","2","3")), Spark fails with a com.fasterxml.jackson.module error. I get the following:

java.lang.VerifyError: class com.fasterxml.jackson.module.scala.ser.ScalaIteratorSerializer overrides final method withResolved.(Lcom/fasterxml/jackson/databind/BeanProperty;Lcom/fasterxml/jackson/databind/jsontype/TypeSerializer;Lcom/fasterxml/jackson/databind/JsonSerializer;)Lcom/fasterxml/jackson/databind/ser/std/AsArraySerializerBase; 
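For reference, the driver doing this is essentially the following minimal sketch (the app name and local master are placeholders, not the real setup):

import org.apache.spark.{SparkConf, SparkContext}

object JacksonRepro {
  def main(args: Array[String]): Unit = {
    // Placeholder configuration; the real job may run against a cluster instead.
    val conf = new SparkConf().setAppName("jackson-repro").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // The call from the question; the VerifyError above appears when this job runs.
    sc.parallelize(List("1", "2", "3")).collect().foreach(println)

    sc.stop()
  }
}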

I suppose some of the library dependencies are conflicting. My build.sbt looks like this:

scalaVersion := "2.11.7" 

//Library repositories 
resolvers ++= Seq(
    Resolver.mavenLocal, 
    "Scala-Tools Maven2 Repository" at "http://scala-tools.org/repo-releases", 
    "Java.net repository" at "http://download.java.net/maven/2", 
    "GeoTools" at "http://download.osgeo.org/webdav/geotools", 
    "Apache" at "https://repository.apache.org/service/local/repositories/releases/content", 
    "Cloudera" at "https://repository.cloudera.com/artifactory/cloudera-repos/", 
    "OpenGeo Maven Repository" at "http://repo.opengeo.org", 
    "Typesafe" at "https://repo.typesafe.com/typesafe/releases/", 
    "Spray Repository" at "http://repo.spray.io" 
) 

//Library versions 
val geotools_version = "13.2" 
val accumulo_version = "1.6.0-cdh5.1.4" 
val hadoop_version = "2.6.0-cdh5.4.5" 
val hadoop_client_version = "2.6.0-mr1-cdh5.4.5" 
val geowave_version = "0.9.0-SNAPSHOT" 
val akka_version = "2.4.0" 
val spray_version = "1.3.3" 
val spark_version = "1.5.0" 

//Library Dependencies 
libraryDependencies ++= Seq(
    //Scala 
    "org.scala-lang" % "scala-library" % scalaVersion.value, 
    "org.scala-lang" % "scala-reflect" % scalaVersion.value, 

    //GeoTools 
    "org.geotools" % "gt-data" % geotools_version, 
    "org.geotools" % "gt-geojson" % geotools_version, 

    //Apache 
    "org.apache.accumulo" % "accumulo-core" % accumulo_version, 
    "org.apache.hadoop" % "hadoop-common" % hadoop_version, 
    "org.apache.hadoop" % "hadoop-client" % hadoop_client_version, 

    //Geowave 
    "mil.nga.giat" % "geowave-core-store" % geowave_version, 
    "mil.nga.giat" % "geowave-datastore-accumulo" % geowave_version, 
    "mil.nga.giat" % "geowave-adapter-vector" % geowave_version, 

    //Other 
    "com.typesafe" % "config" % "1.3.0", 

    //Spray - Akka 
    "com.typesafe.akka" %% "akka-actor" % akka_version, 

    "io.spray" %% "spray-can" % spray_version, 
    "io.spray" %% "spray-routing" % spray_version, 
    "io.spray" %% "spray-testkit" % spray_version % "test", 

    //Spark 
    "org.apache.spark" %% "spark-core" % spark_version, 

    "com.typesafe.play" %% "play-json" % "2.5.0-M1", 

    //Testing 
    "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test" 
).map(
    _.excludeAll(ExclusionRule(organization = "org.mortbay.jetty")) 
) 

test in assembly := {} 

Any pointers on where to look?

Thanks

Answers


It was indeed a dependency conflict: play-json and Spark pull in different versions of jackson-databind. This build.sbt seems to fix the problem:

scalaVersion := "2.11.7" 

//Library repositories 
resolvers ++= Seq(
    Resolver.mavenLocal, 
    "Scala-Tools Maven2 Repository" at "http://scala-tools.org/repo-releases", 
    "Java.net repository" at "http://download.java.net/maven/2", 
    "GeoTools" at "http://download.osgeo.org/webdav/geotools", 
    "Apache" at "https://repository.apache.org/service/local/repositories/releases/content", 
    "Cloudera" at "https://repository.cloudera.com/artifactory/cloudera-repos/", 
    "OpenGeo Maven Repository" at "http://repo.opengeo.org", 
    "Typesafe" at "https://repo.typesafe.com/typesafe/releases/", 
    "Spray Repository" at "http://repo.spray.io" 
) 

//Library versions 
val geotools_version = "13.2" 
val accumulo_version = "1.6.0-cdh5.1.4" 
val hadoop_version = "2.6.0-cdh5.4.5" 
val hadoop_client_version = "2.6.0-mr1-cdh5.4.5" 
val geowave_version = "0.9.0-SNAPSHOT" 
val akka_version = "2.4.0" 
val spray_version = "1.3.3" 
val spark_version = "1.5.2" 

//Library Dependencies 
libraryDependencies ++= Seq(
    //Scala 
    "org.scala-lang" % "scala-library" % scalaVersion.value, 
    "org.scala-lang" % "scala-reflect" % scalaVersion.value, 

    //GeoTools 
    "org.geotools" % "gt-data" % geotools_version, 
    "org.geotools" % "gt-geojson" % geotools_version, 

    /** ********************************************** PROVIDED ****************************************/ 
    // Apache 
    // "org.apache.accumulo" % "accumulo-core" % accumulo_version % "provided", 
    // "org.apache.hadoop" % "hadoop-common" % hadoop_version% "provided", 
    // "org.apache.hadoop" % "hadoop-client" % hadoop_client_version% "provided", 
    // 
    // //Geowave 
    // "mil.nga.giat" % "geowave-core-store" % geowave_version % "provided", 
    // "mil.nga.giat" % "geowave-datastore-accumulo" % geowave_version % "provided", 
    // "mil.nga.giat" % "geowave-adapter-vector" % geowave_version % "provided", 

    /** **********************************************************************************************/ 
    //Apache 
    "org.apache.accumulo" % "accumulo-core" % accumulo_version, 
    "org.apache.hadoop" % "hadoop-common" % hadoop_version, 
    "org.apache.hadoop" % "hadoop-client" % hadoop_client_version, 

    //Geowave 
    "mil.nga.giat" % "geowave-core-store" % geowave_version, 
    "mil.nga.giat" % "geowave-datastore-accumulo" % geowave_version, 
    "mil.nga.giat" % "geowave-adapter-vector" % geowave_version, 

    //Other 
    "com.typesafe" % "config" % "1.3.0", 

    //Spray - Akka 
    "com.typesafe.akka" %% "akka-actor" % akka_version, 

    "io.spray" %% "spray-can" % spray_version, 
    "io.spray" %% "spray-routing" % spray_version, 
    "io.spray" %% "spray-testkit" % spray_version % "test", 
    "com.typesafe.play" %% "play-json" % "2.5.0-M1" 
    exclude("com.fasterxml.jackson.core", "jackson-databind"), 

    //Spark 
    "org.apache.spark" %% "spark-core" % spark_version, 

    //Testing 
    "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test" 
).map(
    _.excludeAll(
    ExclusionRule(organization = "org.mortbay.jetty") 
) 
) 

test in assembly := {} 
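If you want to double-check which jackson-databind actually wins after assembly, one quick sanity check (just a sketch, not part of the original fix) is to print the loaded version at runtime; sbt's evicted task should also list the competing versions during resolution:

import com.fasterxml.jackson.databind.cfg.PackageVersion

object JacksonVersionCheck {
  def main(args: Array[String]): Unit = {
    // PackageVersion.VERSION reports the jackson-databind artifact that was actually loaded;
    // after the fix it should match the version Spark was built against.
    println(s"jackson-databind on the classpath: ${PackageVersion.VERSION}")
  }
}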

Any chance of a fix for this using Maven instead of sbt? – astiefel


I don't think it would be very different; dependency resolution works much the same way in both. –


Yes, I'm just not familiar with sbt, that's all. – astiefel


You can hit the same problem when using Spring Boot and Spark in the same project. Just in case: exclude the jackson-databind dependency from every other dependency that pulls it in, and declare only the version Spark needs, without any exclusions:

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.4.4</version>
</dependency>



I'm using Spring-Boot v1.3.1 and Spark v1.6.0, and I got a conflict over 'com.fasterxml.jackson.core:jackson-databind:2.6.4'. Apparently 'org.springframework.boot:spring-boot-starter-web:1.3.1.RELEASE' needs v2.6.4, while Spark needs v2.4.4. What I had to do was exclude v2.6.4 and then depend explicitly on v2.4.4. It seems to be working so far. –


The Spring Boot 1.3.1 + Apache Spark 1.6.0 issue was solved by adding that dependency.


+1, it works :-). You may also want to add an exclusion for org.codehaus.jackson:jackson-mapper-asl, since there are now two versions of AsArraySerializerBase on the classpath. –


Thanks, this solved my problem! +1. I've also posted the Gradle equivalent as an answer, in case anyone needs it. – gudthing


Just add these lines to your sbt build:

dependencyOverrides ++= Set(
    "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4" 
) 
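For context: dependencyOverrides only pins the version chosen during sbt's conflict resolution; it does not add jackson-databind as a new direct dependency, so Spark and play-json both end up resolving to the same jar.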

This doesn't work with Spark 1.6.1 and play-json 2.4.8 –


Try changing the jackson-databind version. Set it to match your play-json version (e.g. you're currently on 2.4.4, so change it to 2.4.8). –
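In sbt terms that suggestion amounts to something like this (2.4.8 is simply the number from the comment above; adjust it to whatever your play-json release actually pulls in):

dependencyOverrides ++= Set(
    "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.8"
)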


I had the same problem, and njjnex's solution worked for me! I'm using Spring Boot 1.3.3 + Spark 1.6.1 with Gradle 2.9 as the build tool. Here is the solution for Gradle users:

compile group: 'com.fasterxml.jackson.core', name: 'jackson-databind', version: '2.4.4' 

If you run into problems, you can try a different version; the available ones can be found here (for Maven and sbt users as well).