Why does JavaNGramExample fail with "java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class"?

Asked: 2017-12-03 08:45:26

Tags: java apache-spark apache-spark-mllib

I am trying out a simple NGram example in Spark:

https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/ml/JavaNGramExample.java

Here are my pom dependencies:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>2.2.0</version>
    </dependency>
</dependencies>

Here is the sample code:

import java.util.Arrays;
import java.util.List;

import org.apache.spark.ml.feature.NGram;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.Metadata;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class App {
    public static void main(String[] args) {
        System.out.println("Hello World!");

        System.setProperty("hadoop.home.dir", "D:\\del");

        SparkSession spark = SparkSession
                .builder()
                .appName("JavaNGramExample").config("spark.master", "local")
                .getOrCreate();

        List<Row> data = Arrays.asList(
                RowFactory.create(0, Arrays.asList("car", "killed", "cat")),
                RowFactory.create(1, Arrays.asList("train", "killed", "cat")),
                RowFactory.create(2, Arrays.asList("john", "plays", "cricket")),
                RowFactory.create(3, Arrays.asList("tom", "likes", "mangoes")));

        StructType schema = new StructType(new StructField[] {
                new StructField("id", DataTypes.IntegerType, false, Metadata.empty()),
                new StructField("words", DataTypes.createArrayType(DataTypes.StringType), false, Metadata.empty()) });

        Dataset<Row> wordDataFrame = spark.createDataFrame(data, schema);

        NGram ngramTransformer = new NGram().setN(2).setInputCol("words").setOutputCol("ngrams");

        Dataset<Row> ngramDataFrame = ngramTransformer.transform(wordDataFrame);
        System.out.println(" DISPLAY NGRAMS ");
        ngramDataFrame.select("ngrams").show(false);
    }
}

When I run this code I get the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at org.apache.spark.sql.types.StructType.<init>(StructType.scala:98)
    at com.mypackage.spark.learnspark.App.main(App.java:61)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 2 more

I checked the Scala dependency on the classpath and it is scala-library-2.11.8.
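
For reference, the Scala artifacts that Maven actually resolves can be listed with the dependency plugin's tree goal, for example:

mvn dependency:tree -Dincludes=org.scala-lang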

Is there any inconsistency between Spark 2.2.0 and my Scala jar?

1 answer:

Answer 0 (score: 1):

tl;dr Change spark-mllib_2.10 to spark-mllib_2.11 so that Scala 2.11.8 is used for the Spark MLlib dependency (and optionally remove the spark-core_2.11 dependency, since spark-mllib already depends on spark-core).
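
A minimal corrected dependency block, keeping the versions from the question and changing only the Scala suffix of spark-mllib, would look like this:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>
</dependencies>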

Looking at your pom.xml:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>2.2.0</version>
    </dependency>
</dependencies>
spark-core_2.11 from Spark 2.2.0 depends on Scala 2.11.8, and that is fine.

spark-mllib_2.10 from Spark 2.2.0 depends on two different and incompatible Scala versions, 2.10.x and 2.11.8. That is the root cause of the issue.

Make sure you use:

1. The same Scala suffix in the artifactId of all your Spark dependencies, i.e. spark-core_2.11 and spark-mllib_2.11 (note that I changed the suffix to 2.11).

2. The same version in every Spark dependency (a property-based way to enforce both rules is sketched after this list).
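
As a side note, one common way to keep the suffix and version consistent is to factor them into Maven properties. This is a sketch, and the property names are my own choice; since spark-mllib transitively pulls in spark-core, listing spark-mllib alone is enough:

<properties>
    <!-- Scala binary version used as the artifactId suffix of every Spark dependency -->
    <scala.binary.version>2.11</scala.binary.version>
    <!-- Spark version shared by every Spark dependency -->
    <spark.version>2.2.0</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>

With this layout, switching to another Scala or Spark line means editing two properties instead of hunting down every artifactId suffix.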