Spark throws exception at runtime after upgrade to 1.5.1

Date: 2015-10-08 05:09:29

Tags: apache-spark

I upgraded to Spark 1.5.1 and ran into a problem when calling RDD.map(). I get the following exception:

Exception in thread "main" java.lang.IllegalArgumentException
at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:44)
at org.apache.spark.util.ClosureCleaner$.getInnerClosureClasses(ClosureCleaner.scala:81)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:187)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2030)
at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:314)
at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:313)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
at org.apache.spark.rdd.RDD.map(RDD.scala:313)
at com.framedobjects.ClickInvestigation$.main(ClickInvestigation.scala:17)
at com.framedobjects.ClickInvestigation.main(ClickInvestigation.scala)

The error is thrown when mapping an RDD[String] to an RDD[CounterRecord]:

val counterRDD = counterTextRDD.map(mapToCounter(_))
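For context, a minimal sketch of what CounterRecord and mapToCounter might look like is shown below; the field names, the comma delimiter, and the input path are illustrative assumptions, not the actual code from my project:

    case class CounterRecord(id: String, count: Long)              // hypothetical fields

    // Parse one text line into a CounterRecord; the delimiter is assumed.
    def mapToCounter(line: String): CounterRecord = {
      val parts = line.split(",")
      CounterRecord(parts(0), parts(1).toLong)
    }

    val counterTextRDD = sc.textFile("counters.txt")               // hypothetical input
    val counterRDD = counterTextRDD.map(mapToCounter(_))

The map call itself is ordinary; as the stack trace shows, the exception comes from Spark's ClosureCleaner, which uses a shaded ASM ClassReader to inspect the closure's bytecode before shipping it to the executors.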

My build.sbt looks like this:

name := "exploring-spark"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies ++= Seq("org.apache.spark" %% "spark-core" % "1.5.1" withSources,
                        "net.liftweb" %% "lift-json" % "2.6",
                        "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test",
                        "joda-time" % "joda-time" % "2.8.2",
                        "org.yaml" % "snakeyaml" % "1.16",
                        "com.github.seratch" %% "awscala" % "0.3.+" withSources,
                        "org.apache.devicemap" % "devicemap-client" % "1.1.0",
                        "org.apache.devicemap" % "devicemap-data" % "1.0.3")

My feeling is that there is some version mismatch (ASM?), but I can't pin down the problem. I compile against Java 1.8 and run on 1.8.0_40. Any ideas?

Further investigation shows this is an issue with Eclipse (Mars) and Scala-IDE. I can run the code in the spark-shell (v1.5.0).

2 Answers:

Answer 0 (score: 2)

In my case, changing the Scala compiler target to JVM 1.7 fixed the problem.
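If you build with sbt, a minimal sketch of that change looks like the lines below (in Scala-IDE the equivalent target option is in the project's Scala Compiler settings); the javacOptions line only matters if the project also contains Java sources:

    // Emit Java 7 bytecode so the shaded ASM bundled with Spark 1.5.x can parse
    // the generated closure classes; it cannot read Java 8 class files.
    scalacOptions += "-target:jvm-1.7"
    javacOptions ++= Seq("-source", "1.7", "-target", "1.7")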

Answer 1 (score: 1)

Changing the Scala target version to 1.7 works. This was fixed as part of Spark 1.6.0: https://issues.apache.org/jira/browse/SPARK-6152. They removed the external asm dependency entirely.