Spark - Saving a JavaRDD to Cassandra

Date: 2014-12-05 19:40:11

Tags: cassandra apache-spark rdd

http://www.datastax.com/dev/blog/accessing-cassandra-from-spark-in-java

The link above shows how to save a JavaRDD to Cassandra this way:

import static com.datastax.spark.connector.CassandraJavaUtil.*;

JavaRDD<Product> productsRDD = sc.parallelize(products);
javaFunctions(productsRDD, Product.class).saveToCassandra("java_api", "products");

com.datastax.spark.connector.CassandraJavaUtil.* appears to be deprecated. The updated API should be:

import static com.datastax.spark.connector.japi.CassandraJavaUtil.*;

Can someone show me some code that stores a JavaRDD to Cassandra using this updated API?

2 answers:

Answer 0 (score: 6):

Following the documentation, it should be something like this:

javaFunctions(rdd).writerBuilder("ks", "people", mapToRow(Person.class)).saveToCassandra();
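
For reference, here is a minimal self-contained sketch of a save using the japi API (Spark Cassandra Connector 1.1+). The SparkConf settings, the Cassandra host, and the Product bean with id and name fields are assumptions for illustration, not from the original post:

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

import java.io.Serializable;
import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SaveToCassandraExample {

    // Illustrative bean; its properties are assumed to match the columns of java_api.products.
    public static class Product implements Serializable {
        private Integer id;
        private String name;

        public Product() { }
        public Product(Integer id, String name) { this.id = id; this.name = name; }

        public Integer getId() { return id; }
        public void setId(Integer id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    public static void main(String[] args) {
        // Assumed local Cassandra node; adjust the host for your cluster.
        SparkConf conf = new SparkConf()
                .setAppName("save-to-cassandra")
                .setMaster("local[2]")
                .set("spark.cassandra.connection.host", "127.0.0.1");
        JavaSparkContext sc = new JavaSparkContext(conf);

        List<Product> products = Arrays.asList(new Product(1, "bolt"), new Product(2, "nut"));
        JavaRDD<Product> productsRDD = sc.parallelize(products);

        // mapToRow(Product.class) builds a RowWriterFactory from the bean's getters,
        // and writerBuilder targets the keyspace and table to write into.
        javaFunctions(productsRDD)
                .writerBuilder("java_api", "products", mapToRow(Product.class))
                .saveToCassandra();

        sc.stop();
    }
}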

Answer 1 (score: 1):

Replace

JavaRDD<Product> productsRDD = sc.parallelize(products);
javaFunctions(productsRDD, Product.class).saveToCassandra("java_api", "products");

with

JavaRDD<Product> productsRDD = javaFunctions(sc).cassandraTable("java_api", "products", mapRowTo(Product.class));
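
Note that cassandraTable with mapRowTo reads rows from Cassandra into Product objects; it does not save an existing RDD. Saving still goes through writerBuilder with mapToRow, as in answer 0. A small sketch contrasting the two sides, reusing the assumed Product bean and java_api.products table from the example above:

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapRowTo;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class ReadThenWriteExample {

    // Reads java_api.products into Product beans (the bean from the sketch above,
    // assumed to be on the classpath), then writes them back out, showing that
    // mapRowTo is the reading side and mapToRow the writing side of the japi API.
    static void roundTrip(JavaSparkContext sc) {
        JavaRDD<SaveToCassandraExample.Product> fromCassandra = javaFunctions(sc)
                .cassandraTable("java_api", "products",
                        mapRowTo(SaveToCassandraExample.Product.class));

        javaFunctions(fromCassandra)
                .writerBuilder("java_api", "products",
                        mapToRow(SaveToCassandraExample.Product.class))
                .saveToCassandra();
    }
}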