Unable to submit an application to spark-cluster from code

Date: 2016-12-18 16:42:00

Tags: java scala apache-spark apache-spark-mllib

My code is:

SparkSession spark = SparkSession.builder().appName("plzzzz").master("local[*]").getOrCreate();
Dataset<Row> dataset = spark.read().format("libsvm").load("/home/ceny/tools/123.txt");
KMeans kmeans = new KMeans().setK(2).setSeed(1L);
KMeansModel model = kmeans.fit(dataset); // this is line 20 of Test.java (see the stack trace)

The error is:

16/12/19 00:08:49 WARN KMeans: The input data was not directly cached, which may hurt performance if its parent RDDs are also uncached.
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
    at org.apache.spark.ml.clustering.KMeansModel.transform(KMeans.scala:124)
    at org.apache.spark.ml.clustering.KMeans.fit(KMeans.scala:326)
    at Test.main(Test.java:20)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)

My build.gradle is:

compile group: 'org.scala-lang', name: 'scala-reflect', version: '2.11.8'
compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.0.2'
compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.0.2'
compile group: 'org.apache.spark', name: 'spark-mllib_2.10', version: '2.0.2'

My environment is:

Spark version: 2.0.2

Scala version: 2.11.8

Details

When I run the code above inside IDEA, I get the error. If I package it as a JAR and submit it from the terminal, everything works fine.

If I submit the JAR programmatically with SparkSubmit.main(), the error still appears.

What should I do now?

1 answer:

Answer 0 (score: 3)

You need the same Scala binary version for all artifacts. The mllib dependency uses the `_2.10` suffix while everything else uses `_2.11`, which is why `scala.reflect` classes from Scala 2.10 and 2.11 end up mixed on the classpath and you get the `NoSuchMethodError`. It should be:

compile group: 'org.apache.spark', name: 'spark-mllib_2.11', version: '2.0.2'
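
For reference, a fully consistent dependency block would look like the sketch below, based on the versions shown in the question. The key point is that the Scala-version suffix (`_2.11`) on every Spark artifact must agree with the `scala-reflect` version on the classpath:

```groovy
// All Spark artifacts built for Scala 2.11, matching scala-reflect 2.11.8
compile group: 'org.scala-lang', name: 'scala-reflect', version: '2.11.8'
compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.0.2'
compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.0.2'
compile group: 'org.apache.spark', name: 'spark-mllib_2.11', version: '2.0.2'
```

This also explains why submitting the packaged JAR via spark-submit appeared to work: in that case the cluster's own Spark distribution (built consistently against Scala 2.11) supplies the Spark and Scala classes, so the mismatched `spark-mllib_2.10` from your build never reaches the runtime classpath.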