Scala code works in spark-shell but fails with spark-submit

Asked: 2015-08-05 08:48:47

Tags: scala apache-spark

Here is the main Scala code:

    import org.apache.spark.{SparkConf, SparkContext}

    1. val conf = new SparkConf()
    2. conf.setMaster("spark://master:7077")
    3. conf.setAppName("Community Detective")
    4. val sc = new SparkContext(conf)
    5. val rdd = sc.textFile("hdfs://master:9000/lvcy/test/ungraph/test.txt")
    6. val maprdd = rdd.map(line => { val p = line.split("\\s+"); (p(0), p(1)) }) union rdd.map(line => { val p = line.split("\\s+"); (p(1), p(0)) })
    7. val reducerdd = maprdd.reduceByKey((a, b) => a + "\t" + b)
    8. val reduceArray = reducerdd.collect()
    9. val reducemap = reduceArray.toMap
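For reference, lines 6-9 build an adjacency map: each edge is emitted in both directions, neighbors are concatenated with tabs per node, and the result is collected into a `Map`. The same logic can be sketched without Spark using plain Scala collections (the sample edges below are assumed, not from the question's HDFS file):

```scala
// Non-Spark sketch of lines 6-9: build a node -> tab-separated-neighbors map.
object AdjacencySketch {
  def adjacency(lines: Seq[String]): Map[String, String] = {
    // Emit each whitespace-separated edge in both directions (line 6's map/union).
    val pairs = lines.map { line => val p = line.split("\\s+"); (p(0), p(1)) } ++
                lines.map { line => val p = line.split("\\s+"); (p(1), p(0)) }
    // Group by node and join neighbors with tabs (line 7's reduceByKey).
    pairs.groupBy(_._1).map { case (node, vs) => (node, vs.map(_._2).mkString("\t")) }
  }

  def main(args: Array[String]): Unit =
    println(adjacency(Seq("1 2", "1 3", "2 3")))
}
```

This runs on any Scala version and confirms the logic itself is fine; the question's failure is about the runtime environment, not the transformation.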

Problem statement:

  1. Copying the code (lines 5-9) into spark-shell and running it produces the correct result.
  2. But after packaging the code into a jar with Eclipse and submitting the job with `spark-submit`, the following error appears ("Main.scala:21" corresponds to line 9 above, i.e. the `toMap` call fails — why?):

    Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
    at net.lvcy.main.Main$.main(Main.scala:21)
    at net.lvcy.main.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    

2 Answers:

Answer 0 (score: 1)

This looks like a Scala version mismatch. Make sure the Scala version used to build your jar is the same as the Scala version of your Spark cluster's binaries, e.g. 2.10.
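For example, if the cluster runs a Spark release built against Scala 2.10, an sbt build could pin the same version. This is a hypothetical `build.sbt` fragment (the exact Spark and Scala point versions are assumptions; match them to your cluster):

```scala
// build.sbt — pin Scala to the cluster's major version (2.10 assumed here)
scalaVersion := "2.10.5"

// %% appends the Scala binary version to the artifact name (spark-core_2.10);
// "provided" keeps Spark's own jars out of your assembly, so the cluster's
// copies are used at runtime and cannot clash with bundled ones.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1" % "provided"
```

A jar compiled against Scala 2.11 calls `scala.Predef$.$conforms()`, which does not exist in the 2.10 library on the cluster, which is consistent with the `NoSuchMethodError` in the stack trace.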

Answer 1 (score: 0)

Prebuilt Spark distributions are compiled with Scala 2.10, so make sure your Spark cluster is running on Scala 2.10.