So, according to the documentation, reading works fine:
val cql = new org.apache.spark.sql.cassandra.CassandraSQLContext(sc)
cql.setConf("cluster-src/spark.cassandra.connection.host", "1.1.1.1")
cql.setConf("cluster-dst/spark.cassandra.connection.host", "2.2.2.2")
...
val df = cql.read.format("org.apache.spark.sql.cassandra")
.option("table", "my_table")
.option("keyspace", "my_keyspace")
.option("cluster", "cluster-src")
.load()
But it is not clear how to pass the destination cluster name to the corresponding save call. The following clearly does not work; it just tries to connect to the local Spark host:
df.write
.format("org.apache.spark.sql.cassandra")
.option("table", "my_table")
.option("keyspace", "my_keyspace")
.option("cluster", "cluster-dst")
.save()
Update
I found a workaround, but it is a bit ugly. Instead of:
.option("cluster", "cluster-dst")
use:
.option("spark_cassandra_connection_host", cql.getConf("cluster-dst/spark.cassandra.connection.host")