
I am using the DataStax Enterprise 5.1 version of Cassandra on my local machine, and I get a NoHostAvailableException when running Spark with DSE. I started Cassandra with the Spark (Analytics) workload enabled using

dse cassandra -k 

Cassandra starts fine. Next I tried to launch the Spark shell using

dse spark 

However, it gives me the following error:

2017-08-21 12:11:25 [main] ERROR o.a.s.d.DseSparkSubmitBootstrapper - Failed to start or submit Spark application because of com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried) - see details in the log file(s): /home/rsahukar/.spark-shell.log 
com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried) 
    at com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:75) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:28) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:28) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:236) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:59) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:42) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.dse.DefaultDseSession.execute(DefaultDseSession.java:232) ~[dse-java-driver-core-1.2.2.jar:na] 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131] 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131] 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131] 
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131] 
    at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3] 
    at com.sun.proxy.$Proxy6.execute(Unknown Source) ~[na:na] 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131] 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131] 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131] 
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131] 
    at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3] 
    at com.sun.proxy.$Proxy7.execute(Unknown Source) ~[na:na] 
    at com.datastax.bdp.util.rpc.RpcUtil.call(RpcUtil.java:42) ~[dse-core-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$$anonfun$fetch$1.apply(SparkNodeConfiguration.scala:54) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$$anonfun$fetch$1.apply(SparkNodeConfiguration.scala:52) ~[dse-spark-5.1.2.jar:5.1.2] 
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:112) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3] 
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:111) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3] 
    at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:145) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3] 
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:52) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.apply(SparkNodeConfiguration.scala:44) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$8.apply(SparkConfigurator.scala:85) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$8.apply(SparkConfigurator.scala:85) ~[dse-spark-5.1.2.jar:5.1.2] 
    at scala.util.Try$.apply(Try.scala:192) ~[scala-library-2.11.11.jar:na] 
    at com.datastax.bdp.util.Lazy.internal$lzycompute(Lazy.scala:26) ~[dse-spark-5.1.2.jar:5.1.2] 
    at com.datastax.bdp.util.Lazy.internal(Lazy.scala:25) ~[dse-spark-5.1.2.jar:5.1.2] 
    at com.datastax.bdp.util.Lazy.get(Lazy.scala:31) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps$lzycompute(SparkConfigurator.scala:152) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps(SparkConfigurator.scala:151) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries$lzycompute(SparkConfigurator.scala:124) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries(SparkConfigurator.scala:124) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs$lzycompute(DseSparkArgsPreprocessor.scala:79) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs(DseSparkArgsPreprocessor.scala:68) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:106) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala) [dse-spark-5.1.2.jar:5.1.2] 
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried) 
    at com.datastax.driver.core.RequestHandler.reportNoMoreHosts(RequestHandler.java:204) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.RequestHandler.access$1000(RequestHandler.java:40) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.findNextHostAndQuery(RequestHandler.java:268) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.RequestHandler.startNewExecution(RequestHandler.java:108) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.RequestHandler.sendRequest(RequestHandler.java:88) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.SessionManager.executeAsync(SessionManager.java:124) ~[dse-java-driver-core-1.2.2.jar:na] 
    ... 43 common frames omitted 
2017-08-21 12:11:25 [Thread-1] ERROR o.a.s.d.DseSparkSubmitBootstrapper - Failed to cancel delegation token 

Below is the dsetool ring output:

$ dsetool ring 
Address   DC     Rack   Workload    Graph Status State Load    Owns     Token          Health [0,1] 
127.0.0.1  Analytics   rack1  Analytics(SM)  no  Up  Normal 189.19 KiB  ?     5643405743002698980       0.50   

Can someone help me?


Check whether Cassandra is up with 'nodetool status' or 'dsetool ring' –


It is up. I have added the dsetool ring output to the question – rahul

Answer


Finally I found my mistake. I was running Cassandra in local mode, and this is what I had in my Spark conf file (spark-defaults.conf):

.... 
spark.cassandra.connection.local_dc  localhost 
spark.cassandra.connection.host   localhost 
.... 

Note the old value of spark.cassandra.connection.local_dc. Because I was running in local mode, I assumed its value should also be localhost. However, it should be the DC name that dsetool ring returns.

Here is my dsetool ring output again:

$ dsetool ring 
Address   DC     Rack   Workload    Graph Status State Load    Owns     Token          Health [0,1] 
127.0.0.1  Analytics   rack1  Analytics(SM)  no  Up  Normal 189.19 KiB  ?     5643405743002698980       0.50   
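
If in doubt, the node will also report its DC name directly. As a quick cross-check (a sketch; cqlsh ships with DSE), you can query the standard system.local table, which should show the same DC name as dsetool ring:

$ cqlsh -e "SELECT data_center FROM system.local"

 data_center
-------------
   Analytics

(1 rows)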

As we can see, the DC value is Analytics, so that same value must go in the Spark conf file. Below is the changed configuration:

spark.cassandra.connection.local_dc  Analytics 
spark.cassandra.connection.host   localhost
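
After restarting the shell with dse spark, a minimal sketch like the following can confirm that the connector now reaches the node. It uses CassandraConnector from the spark-cassandra-connector bundled with DSE (the same class visible in the stack trace above) and the sc variable that the shell predefines:

import com.datastax.spark.connector.cql.CassandraConnector

// Build a connector from the shell's Spark configuration, which now
// carries the corrected spark.cassandra.connection.* properties.
val connector = CassandraConnector(sc.getConf)

// Open a session, ask the connected node for its data center name,
// and let withSessionDo release the session afterwards.
connector.withSessionDo { session =>
  val row = session.execute("SELECT data_center FROM system.local").one()
  val dc = row.getString("data_center")
  println("Connected to DC: " + dc)
}

With the fix in place this should print "Connected to DC: Analytics" instead of failing with NoHostAvailableException.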