Spark 1.3 and Cassandra 3.0 Guava issue

I'm trying to connect to Cassandra 3.0 from Spark 1.3. I know every Spark version has a matching Cassandra connector, but those connectors depend on cassandra-driver-core:2.1.5, which is why I'm using the latest connector, which depends on the latest core driver. In any case, that hasn't been the problem so far. The problem, I think, is the com.google.guava package.

My POM looks like this:

... 
<dependency> 
    <groupId>com.datastax.spark</groupId> 
    <artifactId>spark-cassandra-connector-java_2.10</artifactId> 
    <version>1.5.0-M3</version> 
</dependency> 
<dependency> 
    <groupId>com.datastax.spark</groupId> 
    <artifactId>spark-cassandra-connector_2.10</artifactId> 
    <version>1.5.0-M3</version> 
</dependency> 
... 

我從到處都排除谷歌番石榴:

<exclusions> 
    <exclusion> 
     <groupId>com.google.guava</groupId> 
     <artifactId>guava</artifactId> 
    </exclusion> 
</exclusions> 
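
For the exclusions to take effect, each one has to sit inside the <dependency> element that pulls Guava in transitively. For reference, a sketch of what that looks like on a single dependency (spark-core and the 1.3.1 version are just illustrative here, not taken from the original POM):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.3.1</version>
    <exclusions>
        <exclusion>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
        </exclusion>
    </exclusions>
</dependency>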

So the only Guava remaining in the dependency tree is com.google.guava:guava:jar:16.0.1, pulled in via com.datastax.spark:spark-cassandra-connector-java_2.10:jar:1.5.0-M3:compile. But I still get the following error:

yarn.ApplicationMaster: User class threw exception: Failed to open native connection to Cassandra at {139.19.52.111}:9042 
java.io.IOException: Failed to open native connection to Cassandra at {139.19.52.111}:9042 
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:162) 
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148) 
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148) 
    at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31) 
    at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56) 
    at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81) 
    at com.ambiverse.tagging.dao.impl.DAOCassandra.createTable(DAOCassandra.java:45) 
    at com.ambiverse.tagging.dao.impl.DAOCassandra.createTable(DAOCassandra.java:64) 
    at com.ambiverse.tagging.dao.impl.DAOCassandra.savePairRDD(DAOCassandra.java:70) 
    at com.ambiverse.tagging.statistics.entitycorrelation.CorrelationStatisticsSparkRunner.run(CorrelationStatisticsSparkRunner.java:176) 
    at com.ambiverse.tagging.statistics.entitycorrelation.CorrelationStatisticsSparkRunner.main(CorrelationStatisticsSparkRunner.java:94) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:497) 
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:480) 
Caused by: java.lang.NoSuchMethodError: com.google.common.util.concurrent.Futures.withFallback(Lcom/google/common/util/concurrent/ListenableFuture;Lcom/google/common/util/concurrent/FutureFallback;Ljava/util/concurrent/Executor;)Lcom/google/common/util/concurrent/ListenableFuture; 
    at com.datastax.driver.core.Connection.initAsync(Connection.java:178) 
    at com.datastax.driver.core.Connection$Factory.open(Connection.java:742) 
    at com.datastax.driver.core.ControlConnection.tryConnect(ControlConnection.java:240) 
    at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:187) 
    at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:79) 
    at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1393) 
    at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:402) 
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155) 

Someone pointed me to this blog post as a solution: http://arjon.es/2015/10/12/making-hadoop-2-dot-6-plus-spark-cassandra-driver-play-nice-together/, but it uses SBT, and I'm using Maven as my build tool. If you know how to do the same thing with Maven, that would be great.
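
The blog post does the fix by shading Guava with sbt-assembly. The Maven counterpart is the maven-shade-plugin's relocation feature: bundle Guava into the job jar and rewrite its package name, so the Cassandra driver resolves Futures.withFallback against the bundled copy rather than an older Guava on the cluster classpath. A minimal sketch (the plugin version and the shaded.guava package name are illustrative choices, not from the post):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.4.3</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <!-- Rewrite the bundled Guava classes and every bytecode
                         reference to them inside the shaded jar, so the driver
                         cannot pick up an older Guava from the classpath. -->
                    <relocation>
                        <pattern>com.google.common</pattern>
                        <shadedPattern>shaded.guava</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>

After mvn package, the shaded uber-jar carries its own renamed Guava, which cannot collide with the copy YARN puts first on the classpath.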

That's strange; to me this error suggests you must have a Guava version older than 14.0 on the classpath, since that is when this withFallback overload was introduced. On the other hand, we have added checks to the DataStax java-driver that throw an exception when an old Guava version is detected... –
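
A quick way to see which jar actually supplies Guava at runtime is to ask the classloader. A small sketch (to be run from the driver code before opening the Cassandra session):

// Print the jar the Guava Futures class was loaded from; this shows
// whether the application's Guava or the cluster's copy won.
val guavaSource = classOf[com.google.common.util.concurrent.Futures]
  .getProtectionDomain.getCodeSource.getLocation
println(s"Guava loaded from: $guavaSource")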

By the way, how would I fix this if I launch Spark from IDEA? – szu

Answer

Although I'm using Scala + SBT, I also hit several version mismatches between the different Spark-related artifacts, one of them being Guava.

This is how I solved it (in the SBT dependencies):

val sparkVersion = "1.6.1"
val sparkCassandraConnectorVersion = "1.6.0"

val scalaGuiceVersion = "4.0.1"
val cassandraUnitVersion = "3.0.0.1"
val typesafeConfigVersion = "1.3.0"
val findbugsVersion = "3.0.0"
val sparkRabbitmqVersion = "0.4.0.20160613"
val nettyAllVersion = "4.0.33.Final"
val guavaVersion = "19.0"
val jacksonVersion = "2.7.4"
val xbeanAsm5ShadedVersion = "4.5"
val commonsBeanutilsVersion = "1.8.0"
val scalaTestVersion = "2.2.6" // not defined in the original snippet; any recent ScalaTest 2.x should do

// IMPORTANT: all Spark dependency magic is done in one place, to overcome the assembly mismatch errors
val sparkDependencies: List[ModuleID] = List(
  ("org.apache.spark" %% "spark-core" % sparkVersion).exclude("com.esotericsoftware.minlog", "minlog"),
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  ("com.datastax.spark" %% "spark-cassandra-connector" % sparkCassandraConnectorVersion)
    .exclude("org.apache.cassandra", "cassandra-clientutil"),
  "com.stratio.receiver" % "spark-rabbitmq_1.6" % sparkRabbitmqVersion,
  "org.scalatest" %% "scalatest" % scalaTestVersion % "test",

  // see https://github.com/apache/spark/pull/9512/files
  "org.apache.xbean" % "xbean-asm5-shaded" % xbeanAsm5ShadedVersion,

  "io.netty" % "netty-all" % nettyAllVersion,
  "commons-beanutils" % "commons-beanutils" % commonsBeanutilsVersion,

  // pin Guava explicitly so the connector and the driver resolve against one version
  "com.google.guava" % "guava" % guavaVersion,

  // pin Jackson to a single version to fix the Jackson mismatch problem
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % jacksonVersion,
  "com.fasterxml.jackson.core" % "jackson-databind" % jacksonVersion,

  // override findbugs artifact versions (fixes assembly issues)
  "com.google.code.findbugs" % "annotations" % findbugsVersion,
  "com.google.code.findbugs" % "jsr305" % findbugsVersion
).map(_.exclude("commons-collections", "commons-collections"))
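
The list is then wired into the build in the usual way; a sketch, assuming a plain build.sbt:

// All Spark-related modules come from the single curated list above.
libraryDependencies ++= sparkDependencies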

I hope this helps.
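
One more option worth mentioning: instead of only declaring Guava as a direct dependency, SBT can force a single Guava version across the entire dependency tree. A sketch, assuming sbt 0.13:

// Force every transitive dependency to resolve Guava to the pinned version,
// so nothing can drag in a pre-14.0 Guava lacking this Futures.withFallback overload.
dependencyOverrides ++= Set(
  "com.google.guava" % "guava" % guavaVersion
)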