I am trying a simple NGram example in Spark. These are my pom dependencies:
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>
    <version>2.2.0</version>
  </dependency>
</dependencies>
Here is the sample code:
import java.util.Arrays;
import java.util.List;

import org.apache.spark.ml.feature.NGram;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.Metadata;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class App {
    public static void main(String[] args) {
        System.out.println("Hello World!");
        System.setProperty("hadoop.home.dir", "D:\\del");
        SparkSession spark = SparkSession
                .builder()
                .appName("JavaNGramExample").config("spark.master", "local")
                .getOrCreate();
        // Each row holds an id and a list of words.
        List<Row> data = Arrays.asList(RowFactory.create(0, Arrays.asList("car", "killed", "cat")),
                RowFactory.create(1, Arrays.asList("train", "killed", "cat")),
                RowFactory.create(2, Arrays.asList("john", "plays", "cricket")),
                RowFactory.create(3, Arrays.asList("tom", "likes", "mangoes")));
        StructType schema = new StructType(new StructField[] {
                new StructField("id", DataTypes.IntegerType, false, Metadata.empty()),
                new StructField("words", DataTypes.createArrayType(DataTypes.StringType), false, Metadata.empty()) });
        Dataset<Row> wordDataFrame = spark.createDataFrame(data, schema);
        // Build 2-grams from the "words" column.
        NGram ngramTransformer = new NGram().setN(2).setInputCol("words").setOutputCol("ngrams");
        Dataset<Row> ngramDataFrame = ngramTransformer.transform(wordDataFrame);
        System.out.println(" DISPLAY NGRAMS ");
        ngramDataFrame.select("ngrams").show(false);
    }
}
When I run this code, I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at org.apache.spark.sql.types.StructType.<init>(StructType.scala:98)
at com.mypackage.spark.learnspark.App.main(App.java:61)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 2 more
I checked the Scala dependency and it is scala-library-2.11.8.
Is there any inconsistency between Spark 2.2.0 and my Scala jar?
Answer 0 (score: 1):
tl;dr Change spark-mllib_2.10 to spark-mllib_2.11 so that Scala 2.11.8 is used for the Spark MLlib dependency (and optionally remove the spark-core_2.11 dependency).
Looking at your pom.xml:
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>
    <version>2.2.0</version>
  </dependency>
</dependencies>
spark-core_2.11 2.2.0 depends on Scala 2.11.8, and that is fine.
spark-mllib_2.10 2.2.0, however, depends on Scala 2.10.x, so your project ends up with two different and incompatible Scala versions, 2.10.x and 2.11.8. That is the root cause of the problem.
Make sure to use:
- the same Scala suffix in the artifactId of every Spark dependency, i.e. spark-core_2.11 and spark-mllib_2.11 (note that I changed it to 2.11), and
- the same version in every Spark dependency.
A corrected dependencies section is sketched below.