Spark submit failing with java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;

Date: 2015-05-20 06:50:34

Tags: java maven apache-spark cassandra-2.0

I am using the Spark 1.3.1 prebuilt version spark-1.3.1-bin-hadoop2.6.tgz.

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
    at org.apache.spark.util.Utils$.getSystemProperties(Utils.scala:1418)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:52)
    at com.zoho.zbi.Testing.test(Testing.java:43)
    at com.zoho.zbi.Testing.main(Testing.java:39)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

I am trying a simple demo app to save to Cassandra:

// Static imports required by the connector's Java API:
// import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
// import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

SparkConf batchConf = new SparkConf()
        .setSparkHome(sparkHome)
        .setJars(jars)
        .setAppName(ZohoBIConstants.getAppName("cassandra"))//NO I18N
        .setMaster(master)
        .set("spark.cassandra.connection.host", "localhost");

JavaSparkContext sc = new JavaSparkContext(batchConf);
// here we are going to save some data to Cassandra...
List<Person> people = Arrays.asList(
        Person.newInstance(1, "John", new Date()),
        Person.newInstance(2, "Anna", new Date()),
        Person.newInstance(3, "Andrew", new Date())
);
// Person test = Person.newInstance(1, "vini", new Date());
System.out.println("Inside Java API Demo : " + people);
JavaRDD<Person> rdd = sc.parallelize(people);
System.out.println("Inside Java API Demo rdd : " + rdd);
javaFunctions(rdd).writerBuilder("test", "people", mapToRow(Person.class)).saveToCassandra();
System.out.println("Stopping sc");
sc.stop();

When I submit it using spark-submit it works:

bin/spark-submit --class "abc.efg.Testing" --master spark://xyz:7077 /home/test/target/uber-Cassandra-0.0.1-SNAPSHOT.jar

Here is my pom:

Dependencies:

<dependencies>
  <!-- Scala -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
        <scope>compile</scope>
    </dependency>

    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-compiler</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <!-- END Scala -->
  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>18.0</version>
  </dependency>

  <dependency>
    <groupId>com.yammer.metrics</groupId>
    <artifactId>metrics-core</artifactId>
    <version>2.2.0</version>
  </dependency>

  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
  </dependency>

  <dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>javax.servlet-api</artifactId>
    <version>3.1.0</version>
    <scope>provided</scope>
  </dependency>

  <dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-core</artifactId>
    <version>2.1.5</version>
  </dependency>

  <dependency>
    <groupId>org.json</groupId>
    <artifactId>json</artifactId>
    <version>20090211</version>
  </dependency>
<!-- Cassandra Spark Connector dependency -->
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.2.0</version>
  </dependency>
<!-- Cassandra java Connector dependency -->
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.2.0</version>
  </dependency> 

<!-- Spark Core dependency -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.3.1</version>
  </dependency>
<!-- Spark Streaming dependency -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>1.3.1</version>
  </dependency>
<!-- Spark Streaming Kafka dependency -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.3.1</version>
  </dependency>
  </dependencies>

and I build using:
<build>
      <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>2.3.2</version>
            <configuration>
                <source>1.7</source>
                <target>1.7</target>
            </configuration>
        </plugin>
           <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.3</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <filters>
                        <filter>
                            <artifact>*:*</artifact>
                            <excludes>
                                <exclude>META-INF/*.SF</exclude>
                                <exclude>META-INF/*.DSA</exclude>
                                <exclude>META-INF/*.RSA</exclude>
                            </excludes>
                        </filter>
                    </filters>
                    <finalName>uber-${project.artifactId}-${project.version}</finalName>
                </configuration>
            </plugin>
           <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>2.3.2</version>
            <configuration>
                <source>1.7</source>
                <target>1.7</target>
            </configuration>
        </plugin>

      </plugins>
    </build>

But when I submit it through code it does not work; any help is much appreciated. I tried adding a scala 2.10.4 property in the pom, still no luck.

I am running it in eclipse as "run as application", with master, spark home and jars all set on the SparkConf, and the error shows up right in the SparkConf.

My scala version is:

scala -version
Scala code runner version 2.11.2 -- Copyright 2002-2013, LAMP/EPFL

Is this somehow related to the issue?

How do I switch to an older version of scala? The docs say spark 1.3.1 supports scala 2.10.x; please let me know how to fix this.

2 Answers:

Answer 0 (score: 19):

The problem you are experiencing is due to incompatible Scala versions. The prebuilt Spark 1.3.1 distribution is compiled with the older Scala 2.10, because some of Spark's dependencies are not supported under 2.11, including JDBC support.

I would recommend running your Spark cluster with Scala 2.10. However, if you want, you can also compile your Spark package with Scala 2.11 in the following way:

dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
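
If you stay on the prebuilt Scala 2.10 distribution instead, the fix on the application side is to make every Scala-cross-built artifact in the pom agree on the _2.10 suffix and pin the Scala runtime to match. A minimal sketch, reusing the versions already present in the question (only the suffixes and the scala.version property change):

<properties>
  <scala.version>2.10.4</scala.version>
</properties>

<dependencies>
  <!-- Scala runtime must match the _2.10 artifacts below -->
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
  <!-- Spark artifacts cross-built for Scala 2.10 -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.3.1</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.3.1</version>
  </dependency>
  <!-- The connector in the question is already a _2.10 build -->
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.2.0</version>
  </dependency>
</dependencies>

The NoSuchMethodError in the question comes exactly from mixing spark-core_2.11 with _2.10 artifacts and a 2.11 Scala runner, so making the suffixes consistent is the essential change.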

Answer 1 (score: 0):

I faced the same problem in Scala IDE. The steps below resolved it.

Note: check compatibility between your Scala and Spark versions. For me, Scala 2.11.* is compatible with Spark 2.4.*; a matching dependency is sketched below.
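
For illustration, a dependency along these lines keeps the artifact suffix in step with a 2.11 Scala installation (the 2.4.0 version number is just one example from that compatibility range):

<!-- Example only: the _2.11 suffix must match the Scala installation
     selected in the IDE settings below. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.4.0</version>
</dependency>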

Go to project >> right click >> Properties >> Scala Compiler >> select the "Use Project Settings" option >> change the "Scala Installation" >> Apply >> Apply and Close.
