How to resolve the scala.Predef$.augmentString error in an Apache Spark / Apache Cassandra integration?

Asked: 2018-07-12 12:23:20

Tags: scala apache-spark cassandra spark-cassandra-connector

I am trying to integrate Apache Spark with Apache Cassandra, but I hit the error below as soon as I try to initialize a SparkConf:

SparkConf sparkConf = new SparkConf().setMaster("local").setAppName("cassandra test");

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.augmentString(Ljava/lang/String;)Ljava/lang/String;
    at org.apache.spark.util.Utils$.<init>(Utils.scala:1928)
    at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
    at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:75)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:70)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:57)

Below is the pom.xml I used:
<dependencies>

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.13.0-M4</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.1</version>
<exclusions>
    <exclusion>  <!-- declare the exclusion here -->
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
    </exclusion>
  </exclusions> 
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.3.1</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.3.1</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>2.3.1</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.11</artifactId>
    <version>1.6.0-M1</version>
</dependency>  
</dependencies>
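The NoSuchMethodError above is the classic symptom of a Scala binary-compatibility mismatch: this pom pins scala-library to 2.13.0-M4, while every Spark and connector artifact carries the `_2.11` suffix, i.e. was compiled against Scala 2.11. A sketch of a consistent dependency block, assuming Scala 2.11 / Spark 2.3.1 (the separate `spark-cassandra-connector-java` artifact was folded into the main connector after the 1.6.x line, so it is dropped here):

```xml
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <!-- must match the _2.11 suffix of the Spark artifacts below -->
        <version>2.11.12</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.3.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.3.1</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>2.3.1</version>
    </dependency>
</dependencies>
```

Running `mvn dependency:tree -Dincludes=org.scala-lang` confirms which scala-library version actually ends up on the classpath after exclusions are applied.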

0 Answers:

No answers yet.