Unable to store a Spark DataFrame to Cassandra using the DataStax Spark Cassandra Connector

Date: 2017-06-07 12:37:33

Tags: scala apache-spark cassandra sbt datastax

I am trying to write a Spark DataFrame to a Cassandra table using the DataStax Spark Cassandra Connector. My Scala version is 2.11.8, and my dependencies are as follows:

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.10" % "1.6.2"

libraryDependencies += "com.datastax.cassandra" % "dse-driver" % "1.1.2"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.1"
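Note that the connector artifact above is pinned to the `_2.10` Scala binary suffix while the project uses Scala 2.11, and the 1.6.x connector line targets Spark 1.6 rather than Spark 2.0. For reference, a build where all artifacts share one Scala binary version and the connector line matches the Spark line might look like the sketch below. The connector version shown is an assumption for illustration, not taken from the question; the cast error mentioning `shade.com.datastax.spark.connector...ListenableFuture` is also commonly associated with an unshaded Cassandra driver (such as the explicit `dse-driver` dependency) clashing with the driver shaded inside the connector, so omitting it is worth trying:

```scala
// build.sbt sketch (illustrative versions, not a confirmed fix).
// All artifacts resolve against the same Scala binary version.
scalaVersion := "2.11.8"

// %% appends the Scala binary suffix (_2.11) automatically,
// which avoids mixing _2.10 and _2.11 artifacts on the classpath.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.1",
  "org.apache.spark" %% "spark-sql"  % "2.0.1",
  // 2.0.x connector line targets Spark 2.0 (assumed version number);
  // no explicit dse-driver entry, relying on the connector's shaded driver.
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"
)
```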

I use the following code to store the DataFrame into the table:

finaldf.write.format("org.apache.spark.sql.cassandra").options(Map("table" -> "data","keyspace" -> "test")).save()

But I keep getting the following error:

com.datastax.driver.core.DefaultResultSetFuture cannot be cast to shade.com.datastax.spark.connector.google.common.util.concurrent.ListenableFuture

Any help would be greatly appreciated. Thanks in advance.

0 Answers:

No answers yet.