Results don't match between spark-submit and spark-shell

Date: 2015-07-22 21:20:29

Tags: apache-spark

I have a simple Spark test program, shown below. Strangely, it runs fine under spark-shell, but at runtime it throws

java.lang.NoSuchMethodError:

when run through spark-submit, pointing at the following line as the problem:

val maps2=maps.collect.toMap

But why does it compile without any problem, and run fine under spark-shell? Thanks!

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.mllib.feature.HashingTF
import org.apache.spark.mllib.linalg.Vector
import org.apache.spark.rdd.RDD


val docs = sc.parallelize(Array(Array("once", "upon", "a", "time"), Array("there", "was", "a", "king")))

val hashingTF = new HashingTF()

val maps = docs.flatMap { term => term.map(ele => (hashingTF.indexOf(ele), ele)) }

val maps2 = maps.collect.toMap
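To see what the failing line computes, here is a plain-Scala sketch of the same pipeline with no Spark involved: each word is paired with a hash-derived index, and `toMap` then keeps one entry per index. Note the `indexOf` below is a simplified stand-in for `HashingTF.indexOf` (an assumption for illustration; the real implementation hashes differently):

```scala
object TermIndexSketch {
  val numFeatures = 1 << 20 // HashingTF's default feature-space size

  // Simplified stand-in hash; NOT the actual HashingTF algorithm.
  def indexOf(term: String): Int =
    math.abs(term.hashCode) % numFeatures

  def main(args: Array[String]): Unit = {
    val docs = Array(Array("once", "upon", "a", "time"),
                     Array("there", "was", "a", "king"))
    // Same shape as the Spark code: flatMap over documents, pair index with word.
    val maps = docs.flatMap(doc => doc.map(w => (indexOf(w), w)))
    // Duplicate keys collapse: "a" appears twice but yields one entry.
    val maps2 = maps.toMap
    println(maps2)
  }
}
```

This runs identically whether compiled or pasted into a REPL, which is exactly why the failure above is a runtime environment issue rather than a logic bug.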

1 Answer:

Answer 0: (score: 0)

You should check which Scala version your Spark distribution was compiled against. I had the same problem: my sbt build used Scala 2.11.7 while Spark was built for 2.10. Give that a try!
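Spark's Scala binary version must match the one your project builds with; spark-shell never hits this because it ships with its own matched Scala. A minimal `build.sbt` sketch, assuming a Spark 1.4.x cluster built against Scala 2.10 (the version numbers are illustrative for mid-2015; match them to your actual cluster):

```scala
// build.sbt — keep scalaVersion in sync with the Scala version
// your Spark distribution was compiled against (2.10.x here).
name := "spark-test"

scalaVersion := "2.10.4" // NOT 2.11.x if Spark was built for 2.10

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.4.1" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.4.1" % "provided"
)
```

The `%%` operator appends the Scala binary suffix (here `_2.10`) to the artifact name, so a mismatched `scalaVersion` silently pulls jars compiled for the wrong Scala, which compiles fine but surfaces as `NoSuchMethodError` at runtime under spark-submit.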