Spark error: ClassNotFoundException: scala.Cloneable

Posted: 2019-07-30 03:19:10

Tags: scala apache-spark

I'm using the latest version of IntelliJ IDEA. My sbt version is 1.2.8 and my Scala version is 2.13.0. Here is my pom.xml:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.version}</version>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.3</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>

I have tried adding the dependencies below to my application and changing their versions, but the error persists.

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.13.0</version>
</dependency>

<dependency>
    <groupId>com.typesafe.akka</groupId>
    <artifactId>akka-actor_2.11</artifactId>
    <version>2.3.12</version>
</dependency>

I ran the following code:

import org.apache.spark.{SparkConf, SparkContext}

object HelloWorld {

  def connect(): SparkContext = {
    def appName: String = "appName"

//    def master: String = "spark://10.21.49.2:7077"
    def master = "local"
    System.setProperty("hadoop.home.dir", "G:\\JavaWeb\\hadoop-2.7.1")
    val conf = new SparkConf().setAppName(appName).setMaster(master)
    new SparkContext(conf)
  }

  def main(args: Array[String]): Unit = {
    def sc: SparkContext = connect()

    sc.parallelize(Array(2, 3))
  }
}

But I get this exception:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Cloneable
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at HelloWorld$.connect(HelloWorld.scala:11)
    at HelloWorld$.sc$1(HelloWorld.scala:16)
    at HelloWorld$.main(HelloWorld.scala:18)
    at HelloWorld.main(HelloWorld.scala)
Caused by: java.lang.ClassNotFoundException: scala.Cloneable
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 16 more

Thanks. I downgraded the scala-library dependency to 2.12.x, and now I get the new exception below.

Exception in thread "main" java.lang.NoSuchMethodError: scala.util.matching.Regex.<init>(Ljava/lang/String;Lscala/collection/Seq;)V
    at scala.collection.immutable.StringLike.r(StringLike.scala:281)
    at scala.collection.immutable.StringLike.r$(StringLike.scala:281)
    at scala.collection.immutable.StringOps.r(StringOps.scala:29)
    at scala.collection.immutable.StringLike.r(StringLike.scala:270)
    at scala.collection.immutable.StringLike.r$(StringLike.scala:270)
    at scala.collection.immutable.StringOps.r(StringOps.scala:29)
    at org.apache.spark.util.Utils$.<init>(Utils.scala:1427)
    at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
    at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:76)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:71)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
    at HelloWorld$.connect(HelloWorld.scala:11)
    at HelloWorld$.sc$1(HelloWorld.scala:16)
    at HelloWorld$.main(HelloWorld.scala:17)
    at HelloWorld.main(HelloWorld.scala)

2 Answers:

Answer 0 (score: 2)

OK, I found this:

Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.3 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x).

You should use Scala 2.12, not the recently released 2.13, which Spark does not support yet.
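
A minimal sketch of what the aligned Maven dependencies could look like; the 2.12.8 patch version is only an example, and any 2.12.x release compatible with Spark 2.4.3 should do:

<!-- Scala runtime matching the _2.12 Spark artifact (2.12.8 is illustrative) -->
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.12.8</version>
</dependency>

<!-- Spark core built against Scala 2.12 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.3</version>
</dependency>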

Answer 1 (score: 1)

This is clearly a Spark/Scala version mismatch.
The documentation says:

Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.3 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x).

So you need to downgrade the version of the following dependency (a corrected sketch follows it):

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.13.0</version>
</dependency>
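
For example, assuming you stay on Spark 2.4.3, a 2.12.x scala-library works (2.12.8 is just an illustrative patch level). Any other dependency whose artifact name carries a Scala binary suffix, such as akka-actor_2.11, should also be switched to a _2.12 build so that only one Scala version ends up on the classpath; the Akka version shown is an assumption, so check that the one you pick is actually published for 2.12:

<!-- illustrative versions; any 2.12.x-compatible releases should work -->
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.12.8</version>
</dependency>

<dependency>
    <groupId>com.typesafe.akka</groupId>
    <artifactId>akka-actor_2.12</artifactId>
    <version>2.5.23</version>
</dependency>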