datastax - Spark shell startup error

Asked: 2017-05-09 21:19:13

标签: apache-spark cassandra datastax datastax-enterprise

I enabled Spark on a few nodes using DataStax Enterprise. After enabling it, I restarted the DSE service; here is my cluster status:

dsetool ring output:

[user@server ~]$ dsetool ring
Address          DC     Rack    Workload        Graph  Status  State    Load             Owns     VNodes    Health [0,1]
192.168.1.130    dc1    rack1   Analytics(SM)   no     Up      Normal   666.47 MiB       ?        128        0.00
192.168.1.131    dc1    rack1   Analytics(SW)   no     Up      Normal   672.09 MiB       ?        128        0.00
192.168.1.132    dc1    rack1   Search          no     Up      Normal   658.48 MiB       ?        128        0.90

When I try to start the Spark shell, I get the following errors...

The log file is at /root/.spark-shell.log
WARN  2017-05-09 14:09:15,215 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 1 seconds...
WARN  2017-05-09 14:09:18,459 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 2 seconds...
WARN  2017-05-09 14:09:22,698 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 4 seconds...
WARN  2017-05-09 14:09:28,941 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 8 seconds...
WARN  2017-05-09 14:09:39,234 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 16 seconds...
ERROR 2017-05-09 14:09:57,476 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application
WARN  2017-05-09 14:09:59,869 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 1 seconds...
WARN  2017-05-09 14:10:03,099 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 2 seconds...
WARN  2017-05-09 14:10:07,346 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 4 seconds...
WARN  2017-05-09 14:10:13,678 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 8 seconds...
WARN  2017-05-09 14:10:23,913 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 16 seconds...
ERROR 2017-05-09 14:10:42,247 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to cancel delegation token

Exception from the log file:

2017-05-09 16:10:49 [main] ERROR o.a.s.d.DseSparkSubmitBootstrapper - Failed to start or submit Spark application
java.io.IOException: Failed to fetch dynamic configuration from DSE
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:86) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:84) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:84) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:84) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:84) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:84) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkNodeConfiguration$.apply(SparkNodeConfiguration.scala:43) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.x$4$lzycompute(SparkConfigurator.scala:85) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.x$4(SparkConfigurator.scala:71) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.sparkNodeConfiguration$lzycompute(SparkConfigurator.scala:71) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.sparkNodeConfiguration(SparkConfigurator.scala:71) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.dseDriverProps$lzycompute(SparkConfigurator.scala:180) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.dseDriverProps(SparkConfigurator.scala:149) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries$lzycompute(SparkConfigurator.scala:124) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries(SparkConfigurator.scala:124) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs$lzycompute(DseSparkArgsPreprocessor.scala:79) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs(DseSparkArgsPreprocessor.scala:68) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:67) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala) [dse-spark-5.1.0.jar:5.1.0]
Caused by: java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.132}:9042
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:168) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:51) ~[dse-spark-5.1.0.jar:5.1.0]
        ... 18 common frames omitted
Caused by: com.datastax.driver.core.exceptions.AuthenticationException: Authentication error on host /192.168.1.132:9042: Host /192.168.1.132:9042 requires authentication, but no authenticator found in Cluster configuration
        at com.datastax.driver.core.AuthProvider$1.newAuthenticator(AuthProvider.java:31) ~[dse-java-driver-core-1.2.2.jar:na]
        at com.datastax.driver.core.Connection$5.apply(Connection.java:248) ~[dse-java-driver-core-1.2.2.jar:na]
        at com.datastax.driver.core.Connection$5.apply(Connection.java:233) ~[dse-java-driver-core-1.2.2.jar:na]
        at com.google.common.util.concurrent.Futures$ChainingListenableFuture.run(Futures.java:906) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.Futures$1$1.run(Futures.java:635) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.MoreExecutors$DirectExecutorService.execute(MoreExecutors.java:299) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.Futures$1.run(Futures.java:632) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:457) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.ExecutionList.executeListener(ExecutionList.java:156) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.ExecutionList.execute(ExecutionList.java:145) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.AbstractFuture.set(AbstractFuture.java:185) ~[guava-18.0.jar:na]
        at com.datastax.driver.core.Connection$Future.onSet(Connection.java:1293) ~[dse-java-driver-core-1.2.2.jar:na]
        at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1074) ~[dse-java-driver-core-1.2.2.jar:na]
        at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:991) ~[dse-java-driver-core-1.2.2.jar:na]
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:934) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:405) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:310) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at java.lang.Thread.run(Unknown Source) ~[na:1.8.0_121]

2 answers:

Answer 0 (score: 2):

Thanks for all the responses. I figured this out, and I was able to start the DSE Spark shell by issuing the following command:

sudo dse -u <cassandra_username> -p <cassandra_password> spark

This was necessary because I had enabled internal authentication on the Cassandra cluster.

Note: My setup uses the DataStax Enterprise binaries; the command above may not apply as-is if you have installed Apache Cassandra and Apache Spark separately.
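
For reference, in a standalone setup with open-source Apache Spark, the equivalent would be to pass the credentials through the Spark Cassandra Connector's auth properties rather than the dse launcher flags. A minimal sketch, assuming the open-source connector's spark.cassandra.auth.* settings and the same placeholder credentials:

# Sketch for a non-DSE install: authenticate the Spark Cassandra Connector
# directly. The connector coordinates and properties come from the
# open-source connector docs; 192.168.1.132 is the contact point from the
# error above.
spark-shell \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.1 \
  --conf spark.cassandra.connection.host=192.168.1.132 \
  --conf spark.cassandra.auth.username=<cassandra_username> \
  --conf spark.cassandra.auth.password=<cassandra_password>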

Answer 1 (score: 1):

DSE is hitting a communication error. This may be because your DC runs mixed workloads, which is not supported. Set all nodes to Analytics, all nodes to Search, or all nodes to SearchAnalytics.
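
If mixed workloads are the cause (the Workload column in the dsetool ring output above shows two Analytics nodes and one Search node in dc1), every node in the DC needs the same workload. A hedged sketch for a package install, where the workload flags live in /etc/default/dse (a tarball install would pass the -k/-s flags to dse cassandra instead):

# /etc/default/dse -- set identically on every node in the DC, then restart DSE.
# Enable both flags for a SearchAnalytics DC; for an Analytics-only DC,
# set only SPARK_ENABLED.
SPARK_ENABLED=1   # run this node with the Analytics (Spark) workload
SOLR_ENABLED=1    # run this node with the Search (Solr) workload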