DSE 6.7 AlwaysOn SQL

Posted: 2019-12-26 17:23:58

Tags: datastax-enterprise alwayson

I have enabled AlwaysOn SQL through the dse.yaml file. When I run the command "dse client-tool alwayson-sql start", I get the following error in system.log:

WARN  2019-12-26 12:22:33,606 org.apache.spark.util.Utils: Your hostname, ubuntu1 resolves to a loopback address: 127.0.1.1; using 192.168.93.124 instead (on interface enp0s3)
WARN  2019-12-26 12:22:33,608 org.apache.spark.util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
WARN  2019-12-26 12:22:37,415 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 2 seconds...
WARN  2019-12-26 12:22:40,649 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 4 seconds...
ERROR 2019-12-26 12:22:44,919 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application because of java.io.IOException: Failed to fetch dynamic configuration from DSE
java.io.IOException: Failed to fetch dynamic configuration from DSE
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:85)
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:83)
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:83)
    at org.apache.spark.deploy.SparkNodeConfiguration$.apply(SparkNodeConfiguration.scala:45)
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$dynamicConfiguration$2.apply(SparkConfigurator.scala:100)
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$dynamicConfiguration$2.apply(SparkConfigurator.scala:99)
    at scala.util.Try$.apply(Try.scala:192)
    at com.datastax.bdp.util.Lazy.internal$lzycompute(Lazy.scala:26)
    at com.datastax.bdp.util.Lazy.internal(Lazy.scala:25)
    at com.datastax.bdp.util.Lazy.get(Lazy.scala:31)
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps$lzycompute(SparkConfigurator.scala:176)
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps(SparkConfigurator.scala:175)
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries$lzycompute(SparkConfigurator.scala:147)
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries(SparkConfigurator.scala:147)
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs$lzycompute(DseSparkArgsPreprocessor.scala:86)
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs(DseSparkArgsPreprocessor.scala:75)
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:93)
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala)
Caused by: java.io.IOException: Failed to open native connection to Cassandra at {192.168.14.2, 192.168.14.3}:9042
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:184)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$10.apply(CassandraConnector.scala:167)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$10.apply(CassandraConnector.scala:167)
    at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32)
    at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69)
    at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57)
    at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:114)
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:53)
    ... 17 common frames omitted
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /192.168.14.2:9042 (com.datastax.driver.core.exceptions.TransportException: [/192.168.14.2:9042] Cannot connect), /192.168.14.3:9042 (com.datastax.driver.core.exceptions.TransportException: [/192.168.14.3:9042] Cannot connect))
    at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:259)
    at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:98)
    at com.datastax.driver.core.Cluster$Manager.negotiateProtocolVersionAndConnect(Cluster.java:1687)
    at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1606)
    at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:453)
    at com.datastax.driver.core.DelegatingCluster.getMetadata(DelegatingCluster.java:89)
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:174)
    ... 25 common frames omitted
ERROR 2019-12-26 12:22:45,022 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to cancel delegation token
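
For reference, AlwaysOn SQL was enabled through the alwayson_sql_options section of dse.yaml, roughly as sketched below. This is a minimal sketch only: the option names and values are assumptions based on typical DSE 6.7 defaults, not a copy of my actual file.

    # dse.yaml (sketch -- option names assumed from DSE 6.7 defaults)
    alwayson_sql_options:
        enabled: true            # turn on the AlwaysOn SQL service on this Analytics node
        thrift_port: 10000       # JDBC/Thrift port clients connect to (assumed default)
        web_ui_port: 9077        # AlwaysOn SQL web UI port (assumed default)

The service was then started per node with the "dse client-tool alwayson-sql start" command shown above, which is when the error appears.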

0 Answers

No answers yet.