Cassandra "Connection has been closed" with the Java Spark driver

Date: 2016-12-24 20:57:10

Tags: java apache-spark cassandra cql

Sometimes, when I try to connect to a Cassandra node from Spark in Java, the following exception is thrown:

Connection has been closed
    at com.datastax.driver.core.exceptions.TransportException.copy(TransportException.java:38)
    at com.datastax.driver.core.exceptions.TransportException.copy(TransportException.java:24)
    at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:37)
    at com.datastax.driver.core.ArrayBackedResultSet$MultiPage.prepareNextRow(ArrayBackedResultSet.java:313)
    at com.datastax.driver.core.ArrayBackedResultSet$MultiPage.isExhausted(ArrayBackedResultSet.java:269)
    at com.datastax.driver.core.ArrayBackedResultSet$1.hasNext(ArrayBackedResultSet.java:143)
    at database.Spark.prepareMaps(Spark.java:97)
    at database.Spark.main(Spark.java:755)

If I re-run the code without any changes, everything works fine. Here is the code I wrote:

javaSparkContext = new JavaSparkContext(sparkConf);
cassandraConnector = CassandraConnector.apply(javaSparkContext.getConf());
try {
    session = cassandraConnector.openSession();
    LOGGER.info("Connected successfully to Cassandra");
} catch (Exception exception) {
    LOGGER.error("Couldn't connect to Cassandra or error in query", exception);
}
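Note that the stack trace points at `ArrayBackedResultSet` paging, so the connection is dropping while iterating the result set, not during `openSession()`. Since the error is transient (a re-run succeeds), one common mitigation is to retry the failing operation a few times with a short backoff. Below is a minimal, self-contained sketch of such a retry helper; the class and method names (`Retry`, `withRetry`) are illustrative and not part of the DataStax driver or Spark APIs:

```java
import java.util.concurrent.Callable;

public class Retry {

    // Runs the given task, retrying up to maxAttempts times on any exception,
    // sleeping backoffMillis between attempts. Rethrows the last failure.
    public static <T> T withRetry(Callable<T> task, int maxAttempts, long backoffMillis)
            throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return task.call();
            } catch (Exception e) {
                last = e;
                if (attempt < maxAttempts) {
                    Thread.sleep(backoffMillis);
                }
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        // Simulated flaky task: fails twice, then succeeds, like a
        // transient "Connection has been closed" error.
        final int[] calls = {0};
        String result = withRetry(() -> {
            calls[0]++;
            if (calls[0] < 3) {
                throw new RuntimeException("Connection has been closed");
            }
            return "ok";
        }, 5, 10);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```

In the question's code, the idea would be to wrap both the query execution and the full iteration of the `ResultSet` (the part that actually pages and fails) inside one such retried task, so a dropped connection restarts the whole read rather than resuming a half-consumed iterator. Driver-level settings (e.g. a reconnection policy on the `Cluster`) can complement this, but retrying at the call site is the simplest fix for an occasional transient failure.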

0 Answers:

No answers yet