Spark-Cassandra connector always defaults to 127.0.1.1

Time: 2017-05-02 13:34:55

Tags: apache-spark cassandra spark-cassandra-connector

The Spark connector for Cassandra keeps trying to connect to 127.0.1.1:9042 even though I am hard-coding the address. Even hard-coding it with conf.set("cassandra.connection.host", "37.61.205.66") does not work. I do not want the Cassandra CQL port running on 127.0.1.1. Is there any solution?

pom.xml:



<dependencies>
        <!-- Scala and Spark dependencies -->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.10</artifactId>
            <version>1.5.0-RC1</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector-java_2.10</artifactId>
            <version>1.5.0-RC1</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.cassandra</groupId>
            <artifactId>cassandra-driver-core</artifactId>
            <version>3.0.0-rc1</version>
        </dependency>
    </dependencies>

1 answer:

Answer 0: (score: 1)

The correct setting is prefixed with 'spark'. See the docs:

conf.set("spark.cassandra.connection.host", cassandraHost)