Cassandra Spark Connector - NoSuchMethodError: scala.runtime.ObjectRef.zero()Lscala/runtime/ObjectRef

Date: 2017-02-04 05:16:39

Tags: java scala apache-spark spark-cassandra-connector


I am trying to connect Spark to a Cassandra database, but I get the error shown below. I suspect it is caused by a version mismatch.

Code:

    SparkConf conf = new SparkConf().setAppName("kafka-sandbox").setMaster("local[2]");
    conf.set("spark.cassandra.connection.host", "192.168.34.1");//connection for cassandra database
    JavaSparkContext sc = new JavaSparkContext(conf);
    CassandraConnector connector = CassandraConnector.apply(sc.getConf());
    final Session session = connector.openSession();//error in this line
    final PreparedStatement prepared = session.prepare("INSERT INTO spark_test5.messages JSON?");
Error:


    Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.ObjectRef.zero()Lscala/runtime/ObjectRef;
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:149)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:149)
        at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
        at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
        at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:82)
pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>SparkPoc</groupId>
  <artifactId>Spark-Poc</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>2.0.0</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>2.0.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-8_2.10</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>2.0.0-M3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.1</version>
    </dependency> 
  </dependencies>
<build>
    <plugins>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.3</version>
        <configuration>
            <source>1.8</source>
            <target>1.8</target>
        </configuration>
    </plugin>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.4.1</version>
        <configuration>
            <!-- get all project dependencies -->
            <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
            </descriptorRefs>
            <!-- MainClass in manifest makes an executable jar -->
            <archive>
                    <manifest>
                            <mainClass>com.nwf.Consumer</mainClass>
                    </manifest>
            </archive>
        </configuration>
        <executions>
            <execution>
                    <id>make-assembly</id>
                    <!-- bind to the packaging phase -->
                    <phase>package</phase>
                    <goals>
                            <goal>single</goal>
                    </goals>
            </execution>
    </executions>
    </plugin>
    </plugins>
</build>
</project>

Spark version: 2.0.0

Scala version: 2.11.8

3 Answers:

Answer 0 (score: 0):

zero() on scala.runtime.VolatileObjectRef was introduced in Scala 2.11. You probably have a library that was compiled against Scala 2.11 running on a Scala 2.10 runtime.

v2.10: https://github.com/scala/scala/blob/2.10.x/src/library/scala/runtime/VolatileObjectRef.java
v2.11: https://github.com/scala/scala/blob/2.11.x/src/library/scala/runtime/VolatileObjectRef.java
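
To see which scala-library actually ends up on the classpath at runtime, a small reflection check can help. This is only an illustrative sketch (the class name ScalaRuntimeCheck and the approach are not from the original post): it looks up ObjectRef.zero(), which exists in Scala 2.11 but not in 2.10, and prints where the scala-library jar was loaded from.

    import java.lang.reflect.Method;

    // Illustrative diagnostic: reports which scala-library jar is on the
    // classpath and whether the 2.11-only ObjectRef.zero() method is present.
    public class ScalaRuntimeCheck {
        public static void main(String[] args) throws Exception {
            Class<?> objectRef = Class.forName("scala.runtime.ObjectRef");
            System.out.println("scala-library loaded from: "
                    + objectRef.getProtectionDomain().getCodeSource().getLocation());
            try {
                Method zero = objectRef.getMethod("zero");
                System.out.println("Found " + zero + " -> Scala 2.11 runtime");
            } catch (NoSuchMethodException e) {
                System.out.println("ObjectRef.zero() missing -> Scala 2.10 runtime on the classpath");
            }
        }
    }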

Answer 1 (score: 0):

According to your pom.xml, the Scala versions are mixed across the dependencies:

  • spark-streaming_2.10
  • spark-core_2.10
  • spark-streaming-kafka-0-8_2.10
  • spark-cassandra-connector_2.11
  • spark-sql_2.11

All dependencies should use the same Scala version. Try changing everything to _2.11.
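
One convenient way to keep them aligned (a suggestion not in the original answer; the property name scala.binary.version is just a common convention) is to declare the Scala binary version once as a Maven property and reference it from every Spark artifactId, for example:

    <properties>
        <scala.binary.version>2.11</scala.binary.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.binary.version}</artifactId>
            <version>2.0.1</version>
        </dependency>
        <!-- use the same ${scala.binary.version} suffix for the other Spark
             and spark-cassandra-connector dependencies -->
    </dependencies>

That way a single property change switches every dependency to the same Scala line.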

Answer 2 (score: 0):

In my pom.xml I changed the Scala version from 2.10 to 2.11.
Given below is the updated pom.xml:


----------
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>SparkPoc</groupId>
  <artifactId>Spark-Poc</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.0.0</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>2.0.0-M3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.1</version>
    </dependency> 
  </dependencies>
<build>
    <plugins>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.3</version>
        <configuration>
            <source>1.8</source>
            <target>1.8</target>
        </configuration>
    </plugin>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.4.1</version>
        <configuration>
            <!-- get all project dependencies -->
            <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
            </descriptorRefs>
            <!-- MainClass in manifest makes an executable jar -->
            <archive>
                    <manifest>
                            <mainClass>com.nwf.Consumer</mainClass>
                    </manifest>
            </archive>
        </configuration>
        <executions>
            <execution>
                    <id>make-assembly</id>
                    <!-- bind to the packaging phase -->
                    <phase>package</phase>
                    <goals>
                            <goal>single</goal>
                    </goals>
            </execution>
    </executions>
    </plugin>
    </plugins>
</build>
</project>