Cassandra "Connection has been closed" error when querying through Spark

Date: 2016-12-07 19:56:49

Tags: java apache-spark cassandra

I am trying to access a remote Cassandra instance through Spark in Java. However, when I try to execute an aggregate function (count), I get the following error:

Exception in thread "main" com.datastax.driver.core.exceptions.TransportException: [/192.168.1.103:9042] Connection has been closed
    at com.datastax.driver.core.exceptions.TransportException.copy(TransportException.java:38)
    at com.datastax.driver.core.exceptions.TransportException.copy(TransportException.java:24)
    at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:37)
    at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:245)

I have already set the timeouts in cassandra.yaml to large values.
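From what I understand, the timeouts in cassandra.yaml only govern the server side; the Java driver and the Spark Cassandra Connector have their own timeouts. A minimal sketch of how those could be raised as well, assuming the connector's spark.cassandra.connection.timeout_ms and spark.cassandra.read.timeout_ms properties (the values here are arbitrary examples):

import org.apache.spark.SparkConf;

SparkConf conf = new SparkConf();
conf.set("spark.cassandra.connection.host", "host");
// Driver-side timeouts in milliseconds; property names taken from the
// Spark Cassandra Connector settings (defaults: 5000 and 120000).
conf.set("spark.cassandra.connection.timeout_ms", "30000");
conf.set("spark.cassandra.read.timeout_ms", "300000");

Here is my code: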

SparkConf conf = new SparkConf();
conf.setAppName("Test");
conf.setMaster("local[*]");
conf.set("spark.cassandra.connection.host", "host");
Spark app = new Spark(conf);   // Spark is my own application class
app.run();
...
// sc is the JavaSparkContext created in the part elided above
CassandraConnector connector = CassandraConnector.apply(sc.getConf());
// Prepare the schema
try (Session session = connector.openSession()) {
    session.execute("USE keyspace0");
    ResultSet results = session.execute("SELECT count(*) FROM table0");
}
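In case the single-statement count is simply too heavy for one query: as far as I know, the connector can also compute the count through Spark itself, splitting the scan across token ranges instead of issuing one long-running CQL read. A sketch assuming the connector's Java API (CassandraJavaUtil and cassandraCount() are from the Spark Cassandra Connector; sc is the JavaSparkContext from the setup above):

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;

// Count through Spark: each task reads one token range, so no single
// query has to scan the whole table within one timeout window.
long count = javaFunctions(sc)
        .cassandraTable("keyspace0", "table0")
        .cassandraCount();
System.out.println("rows: " + count);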

0 Answers:

No answers yet