Error creating a DataFrame when querying a Cassandra table with Spark Scala

Date: 2017-11-30 08:16:16

Tags: scala spark-dataframe spark-cassandra-connector

  val sqlContext = new org.apache.spark.sql.SQLContext(sc)
  import sqlContext.implicits._
  val df = Seq(36, 445).toDF()

The snippet above, which creates a DataFrame, throws the following exception:

    Caused by: com.datastax.driver.core.exceptions.UnavailableException: Not enough replicas available for query at consistency LOCAL_ONE (1 required but only 0 alive)
            at com.datastax.driver.core.Responses$Error$1.decode(Responses.java:42) ~[dse-java-driver-core-1.2.2.jar:na]
            at com.datastax.driver.core.Responses$Error$1.decode(Responses.java:29) ~[dse-java-driver-core-1.2.2.jar:na]
            at com.datastax.driver.core.Message$ProtocolDecoder.decode(Message.java:284) ~[dse-java-driver-core-1.2.2.jar:na]
            at com.datastax.driver.core.Message$ProtocolDecoder.decode(Message.java:264) ~[dse-java-driver-core-1.2.2.jar:na]
            at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
            ... 18 common frames omitted
    ERROR 2017-11-30 01:52:04,070 org.apache.spark.scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerExecutorAdded(1512024724063,0,org.apache.spark.scheduler.cluster.ExecutorData@f32eba1e)
    ERROR 2017-11-30 01:52:04,131 org.apache.spark.scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerBlockManagerAdded(1512024724131,BlockManagerId(0, localhost, 45074),384093388)

1 answer:

Answer 0 (score: 0)

Add this line before creating the DataFrame: sc.getConf.set("spark.cassandra.input.consistency.level","ANY")

You can check the other properties here: https://docs.datastax.com/en/datastax_enterprise/4.8/datastax_enterprise/spark/sparkCassProps.html
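A minimal configuration sketch of where such properties are usually applied, assuming the `spark.cassandra.*` settings from the linked page and a hypothetical local Cassandra host. Spark configuration is normally set on the `SparkConf` before the `SparkContext` is constructed (`sc.getConf` returns a copy of the configuration, so mutating it after startup may have no effect):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Consistency levels are read from the Spark configuration, so set
// them before the context is created.
val conf = new SparkConf()
  .setAppName("cassandra-df-example")                     // hypothetical app name
  .set("spark.cassandra.connection.host", "127.0.0.1")    // hypothetical host
  .set("spark.cassandra.input.consistency.level", "ANY")  // reads, as suggested above
  .set("spark.cassandra.output.consistency.level", "ANY") // writes

val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

// The snippet from the question, unchanged:
val df = Seq(36, 445).toDF()
df.show()
```

Whether `ANY` is accepted for reads depends on the Cassandra/DSE version; the underlying `UnavailableException` indicates that no replica was alive to serve the query, so it is also worth verifying that the Cassandra nodes owning the keyspace are actually up.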