Spark: is there a way to print out the classpath of both spark-shell and spark?

Asked: 2015-05-28 16:49:17

Tags: scala apache-spark

I can run a Spark job successfully in spark-shell, but when it is packaged and run via spark-submit I get a NoSuchMethodError.

This suggests some kind of classpath mismatch between the two. Is there a way to compare the two classpaths? Some kind of logging statement?

Thanks!

15/05/28 12:46:46 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
    at com.ldamodel.LdaModel$$anonfun$5$$anonfun$apply$5.apply(LdaModel.scala:22)
    at com.ldamodel.LdaModel$$anonfun$5$$anonfun$apply$5.apply(LdaModel.scala:22)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at com.ldamodel.LdaModel$$anonfun$5.apply(LdaModel.scala:22)
    at com.ldamodel.LdaModel$$anonfun$5.apply(LdaModel.scala:22)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:202)
    at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:56)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
    at org.apache.spark.scheduler.Task.run(Task.scala:64)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

4 Answers:

Answer 0 (score: 26):

I think this should work:

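A minimal sketch in Scala, assuming the JVM's system class loader is a java.net.URLClassLoader (true on the Java 7/8 runtimes that Spark 1.x targets; the cast fails on Java 9+):

import java.net.URLClassLoader

// Print every classpath entry of the current JVM. Run this directly in
// spark-shell to see the driver's classpath; wrap it in an RDD operation
// (e.g. sc.parallelize(1 to 1).map { _ => ... }.collect()) to see an
// executor's classpath instead.
ClassLoader.getSystemClassLoader
  .asInstanceOf[URLClassLoader]
  .getURLs
  .foreach(url => println(url.getFile))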

Answer 1 (score: 19):

Without modifying the code:

SPARK_PRINT_LAUNCH_COMMAND=true /usr/lib/spark/bin/spark-shell

It also works for spark-submit, as shown below.
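For example (the class name is taken from the stack trace above; the jar path is a placeholder), this prints the exact java launch command, including the full -cp value, before the application starts:

SPARK_PRINT_LAUNCH_COMMAND=true /usr/lib/spark/bin/spark-submit \
  --class com.ldamodel.LdaModel \
  target/ldamodel.jar

Capturing this output from both spark-shell and spark-submit lets you diff the two classpaths directly.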

Answer 2 (score: 5):

This should do it, without requiring any code changes:

--conf 'spark.driver.extraJavaOptions=-verbose:class'
--conf 'spark.executor.extraJavaOptions=-verbose:class'
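A sketch of a full invocation (the jar path and class name are placeholders): with -verbose:class, each JVM logs every class it loads and the jar it was loaded from, so you can see exactly which scala-library version the driver and executors pick up.

spark-submit \
  --conf 'spark.driver.extraJavaOptions=-verbose:class' \
  --conf 'spark.executor.extraJavaOptions=-verbose:class' \
  --class com.ldamodel.LdaModel \
  target/ldamodel.jar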

Answer 3 (score: -3):

/opt/spark/bin/compute-classpath.sh