NoClassDefFoundError: com/google/common/collect/Lists when I use Spark

Date: 2017-06-18 19:08:20

Tags: apache-spark

I am trying to test my code on multiple nodes with Apache Spark. Before this test, I confirmed that the code runs fine in local mode. I then built a JAR with Maven and submitted it with a shell script, but got the error below.
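The submit script is roughly like the following (the master URL and JAR path are placeholders; the main class is the one shown in the stack trace below):

    # placeholders: replace master host and JAR path with the real values
    spark-submit \
      --class SparkPinkMST.Pink \
      --master spark://<master-host>:7077 \
      /path/to/my-app.jar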

 17/06/19 03:59:59 INFO storage.BlockManagerMaster: Registered BlockManager
17/06/19 03:59:59 INFO client.AppClient$ClientEndpoint: Executor updated: app-20170619035958-0003/1 is now RUNNING
17/06/19 03:59:59 INFO client.AppClient$ClientEndpoint: Executor updated: app-20170619035958-0003/0 is now RUNNING
17/06/19 03:59:59 INFO client.AppClient$ClientEndpoint: Executor updated: app-20170619035958-0003/2 is now RUNNING
17/06/19 03:59:59 INFO client.AppClient$ClientEndpoint: Executor updated: app-20170619035958-0003/3 is now RUNNING
17/06/19 03:59:59 INFO cluster.SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/collect/Lists
        at SparkPinkMST.DataSplitter.createPartitionFiles(DataSplitter.java:243)
        at SparkPinkMST.Pink.main(Pink.java:64)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.google.common.collect.Lists
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 11 more

After getting the error above, I googled it and added the Google Guava dependency to my pom.xml, but the result is the same. My problem is that the error only occurs when running on multiple nodes on the Spark platform. How can I fix this?
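For reference, the dependency block I added to my pom.xml looks roughly like this (the exact Guava version shown is only an example):

    <!-- Guava provides com.google.common.collect.Lists; version here is illustrative -->
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>19.0</version>
    </dependency>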

Any help would be greatly appreciated.

0 Answers:

There are no answers yet.