NoHostAvailableException when running Spark with DSE

Time: 2017-08-21 06:51:37

Tags: cassandra datastax datastax-enterprise

I am using the DataStax Enterprise (DSE) 5.1 distribution of Cassandra on my local machine. I start Cassandra with

dse cassandra -k

Cassandra comes up fine. Next I want to get to the Spark shell with

dse spark

However, it gives me the following error:

2017-08-21 12:11:25 [main] ERROR o.a.s.d.DseSparkSubmitBootstrapper - Failed to start or submit Spark application because of com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried) - see details in the log file(s): /home/rsahukar/.spark-shell.log
com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)
    at com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:75) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:28) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:28) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:236) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:59) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:42) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.dse.DefaultDseSession.execute(DefaultDseSession.java:232) ~[dse-java-driver-core-1.2.2.jar:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131]
    at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.sun.proxy.$Proxy6.execute(Unknown Source) ~[na:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131]
    at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.sun.proxy.$Proxy7.execute(Unknown Source) ~[na:na]
    at com.datastax.bdp.util.rpc.RpcUtil.call(RpcUtil.java:42) ~[dse-core-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$$anonfun$fetch$1.apply(SparkNodeConfiguration.scala:54) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$$anonfun$fetch$1.apply(SparkNodeConfiguration.scala:52) ~[dse-spark-5.1.2.jar:5.1.2]
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:112) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:111) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:145) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:52) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.apply(SparkNodeConfiguration.scala:44) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$8.apply(SparkConfigurator.scala:85) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$8.apply(SparkConfigurator.scala:85) ~[dse-spark-5.1.2.jar:5.1.2]
    at scala.util.Try$.apply(Try.scala:192) ~[scala-library-2.11.11.jar:na]
    at com.datastax.bdp.util.Lazy.internal$lzycompute(Lazy.scala:26) ~[dse-spark-5.1.2.jar:5.1.2]
    at com.datastax.bdp.util.Lazy.internal(Lazy.scala:25) ~[dse-spark-5.1.2.jar:5.1.2]
    at com.datastax.bdp.util.Lazy.get(Lazy.scala:31) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps$lzycompute(SparkConfigurator.scala:152) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps(SparkConfigurator.scala:151) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries$lzycompute(SparkConfigurator.scala:124) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries(SparkConfigurator.scala:124) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs$lzycompute(DseSparkArgsPreprocessor.scala:79) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs(DseSparkArgsPreprocessor.scala:68) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:106) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala) [dse-spark-5.1.2.jar:5.1.2]
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)
    at com.datastax.driver.core.RequestHandler.reportNoMoreHosts(RequestHandler.java:204) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.RequestHandler.access$1000(RequestHandler.java:40) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.findNextHostAndQuery(RequestHandler.java:268) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.RequestHandler.startNewExecution(RequestHandler.java:108) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.RequestHandler.sendRequest(RequestHandler.java:88) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.SessionManager.executeAsync(SessionManager.java:124) ~[dse-java-driver-core-1.2.2.jar:na]
    ... 43 common frames omitted
2017-08-21 12:11:25 [Thread-1] ERROR o.a.s.d.DseSparkSubmitBootstrapper - Failed to cancel delegation token

Below is the dsetool ring output:

$ dsetool ring
Address          DC                   Rack         Workload             Graph  Status  State    Load             Owns                 Token                                        Health [0,1] 
127.0.0.1        Analytics            rack1        Analytics(SM)        no     Up      Normal   189.19 KiB       ?                    5643405743002698980                          0.50         

Can anyone help me?

1 Answer:

Answer 0 (score: 0)

In the end I found my own mistake. I was running Cassandra in local mode. Here is my Spark conf file (spark-defaults.conf) before the change:

....
spark.cassandra.connection.local_dc     localhost
spark.cassandra.connection.host         localhost
....

Note the spark.cassandra.connection.local_dc value. Because I was running in local mode, I assumed it should also be set to localhost. However, it must be the data center (DC) name returned by dsetool ring.
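If dsetool is not handy, the DC name can also be read straight from Cassandra's own system.local table. This is just an alternative check, assuming cqlsh is on the PATH and the node listens on the default port:

$ cqlsh -e "SELECT data_center FROM system.local;"

On this node the query returns Analytics, matching the dsetool ring output below.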

Below is my dsetool ring output:

$ dsetool ring
Address          DC                   Rack         Workload             Graph  Status  State    Load             Owns                 Token                                        Health [0,1] 
127.0.0.1        Analytics            rack1        Analytics(SM)        no     Up      Normal   189.19 KiB       ?                    5643405743002698980                          0.50         

As shown above, the DC value is Analytics, so I had to put that same value in the Spark conf file. Here is the configuration after the change:

spark.cassandra.connection.local_dc     Analytics
spark.cassandra.connection.host         localhost
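
After restarting the shell with dse spark, a quick smoke test confirms the fix took effect. This is a minimal sketch, assuming the spark-cassandra-connector implicits bundled with DSE are available in the shell:

import com.datastax.spark.connector._

// Read one row from system.local through the connector. If this succeeds,
// the driver found a contact point in the configured local_dc and the
// NoHostAvailableException above is gone.
val row = sc.cassandraTable("system", "local").first()
println(row.getString("data_center"))   // prints "Analytics" on this node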