org/spark_project/guava/cache/CacheLoader

Date: 2017-11-12 02:11:32

Tags: scala apache-spark

I am trying to run a Spark program with Maven in Scala IDE for Eclipse. However, I get a java.lang.NoClassDefFoundError on the line that initializes SparkConf. I also tried adding a dependency on Guava 14.0.1, but that did not solve the problem either. The stack trace is:

Exception in thread "main" java.lang.NoClassDefFoundError: org/spark_project/guava/cache/CacheLoader
    at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:73)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:68)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:55)
    at oursparkapp2.SimpleApp$.main(SimpleApp.scala:8)
    at oursparkapp2.SimpleApp.main(SimpleApp.scala)
Caused by: java.lang.ClassNotFoundException: org.spark_project.guava.cache.CacheLoader
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 5 more 

The Scala program I am trying to run (SimpleApp.scala) is as follows:

package oursparkapp2

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Hlelo") // line 8: the NoClassDefFoundError is thrown here
  }
}
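
For context, here is a minimal sketch of what a complete local run of such an app usually looks like when launched directly from the IDE (this is not the original code; the local[*] master and the small parallelize job are assumptions for illustration):

package oursparkapp2

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object SimpleAppLocal {
  def main(args: Array[String]): Unit = {
    // A master must be set when the app is started from the IDE
    // rather than through spark-submit.
    val conf = new SparkConf().setAppName("Hlelo").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // A trivial job, just to confirm that the classpath and context work.
    val evens = sc.parallelize(1 to 100).filter(_ % 2 == 0).count()
    println(s"Even numbers: $evens")

    sc.stop()
  }
}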

My pom.xml file is as follows:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>ez.spark</groupId>
  <artifactId>oursparkapp2</artifactId>
  <version>0.0.1-SNAPSHOT</version>

  <dependencies>

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.8.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.0</version>
        <scope>provided</scope>
    </dependency>

  </dependencies>
</project>

Also, Spark runs fine from my terminal using the spark-shell command.

2 Answers:

Answer 0 (score: 1):

I solved this problem by adding the following dependency:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-network-common_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
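
For what it's worth, the spark-network-common artifact appears to be where Spark's relocated (shaded) Guava classes such as org.spark_project.guava.cache.CacheLoader live, which presumably explains why adding it makes the error go away. If you take this route, it is probably safer to keep its version in line with the spark-core version declared in the pom (2.2.0 here, rather than 2.1.0), since mixing versions of Spark modules can cause similar classpath problems.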

Answer 1 (score: 0):

I had the same problem. I solved it by upgrading Spark from 2.0.0 to 2.2.0.