java.lang.ClassCastException in a remote Spark job

Date: 2019-02-09 15:36:45

Tags: apache-spark-mllib classcastexception

I am having trouble submitting a Spark job to run on YARN. I ran this simple piece of code:

transactions = data3.map((Function<String, List<String>>) line -> Arrays.asList(line.split(" ")));

FPGrowth fpg = new FPGrowth()
    .setMinSupport(minSupport)
    .setNumPartitions(10);

FPGrowthModel<String> model2 = fpg.run(transactions);

and I get this error:

Caused by: java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.fun$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1
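For background on why this error mentions SerializedLambda: when Spark ships a task, the driver serializes the Java 8 lambda, and the executor deserializes it by calling a synthetic `$deserializeLambda$` method on the class that defined the lambda. If that class is not on the executor's classpath, deserialization cannot complete and the executor is left with a raw `SerializedLambda`, producing exactly this ClassCastException. The following is a minimal stdlib-only sketch of that round trip (class and method names here are illustrative, not from the post); it succeeds locally precisely because the defining class is on the classpath:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class LambdaRoundTrip {

    // Mirrors Spark's org.apache.spark.api.java.function.Function,
    // which also extends Serializable.
    interface SerFunction<T, R> extends Serializable {
        R apply(T t);
    }

    // Serialize a lambda to bytes, then read it back, as Spark does
    // when shipping a task from the driver to an executor.
    static int demo() throws Exception {
        SerFunction<String, Integer> f = String::length;

        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(f);
        }

        // Deserialization resolves the intermediate SerializedLambda by
        // invoking $deserializeLambda$ on the class that defined the
        // lambda. On an executor, that class must be in a jar on the
        // classpath; otherwise the SerializedLambda cannot be converted
        // back, which is the ClassCastException shown above.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            @SuppressWarnings("unchecked")
            SerFunction<String, Integer> copy =
                (SerFunction<String, Integer>) in.readObject();
            return copy.apply("hello");
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(demo());
    }
}
```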

My problem is that I already call setJars with my project's output jar file, but that does not fix the issue. Has anyone found a solution? Where does the jar file containing the lambda need to be uploaded?
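For reference, the usual remedy for this error is to make sure the jar that contains the lambda-defining class actually reaches the executors, typically by passing the application jar to spark-submit. A hedged sketch (the master, class name, and path below are placeholders, not taken from the post):

```shell
# Ship the application jar (which contains the classes that define the
# lambdas) to the YARN executors. Class name and path are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.FpGrowthJob \
  /path/to/my-app.jar
```

Another commonly reported workaround, if shipping the jar does not help, is to replace the Java 8 lambda with an anonymous inner class implementing `org.apache.spark.api.java.function.Function`, which avoids the SerializedLambda deserialization path entirely.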

0 answers:

No answers yet.