Some RDD actions fail with Java IllegalArgumentException

Asked: 2019-06-21 11:00:14

Tags: scala apache-spark

For some reason, executing certain (but not all) RDD actions on any kind of RDD throws a Java IllegalArgumentException: Unsupported class file major version x. Strangely, this only affects some actions (e.g. collect, take, first) and not others (e.g. sample, takeOrdered). Any idea what the problem is?

The installed Spark version is 2.4.3, and in case that was the problem I have already upgraded the JDK/JRE from 11 to 12.
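One quick way to confirm which JVM the shell is actually running on (independent of the startup banner) is to query the system properties from within spark-shell:

```scala
// Prints the version of the JVM running the driver process, e.g. "12.0.1"
println(System.getProperty("java.version"))

// Prints the class file format that JVM emits/accepts, e.g. "56.0" for Java 12
println(System.getProperty("java.class.version"))
```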

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.3
      /_/

Using Scala version 2.11.12 (OpenJDK 64-Bit Server VM, Java 12.0.1)

The previous setup threw "Unsupported class file major version 55"; after the upgrade the error is the same but with version 56 (so the upgrade clearly succeeded, but did not fix the problem).
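For reference, the class file major version in the error message maps directly to a JDK release: major version 45 is Java 1.1, and each later release increments it by one, so 55 is Java 11 and 56 is Java 12. A minimal sketch of the mapping (the helper name is mine, not part of Spark):

```scala
// Class file major version 45 corresponds to Java 1.1; each later release
// increments the major version by one (52 = Java 8, 55 = Java 11, 56 = Java 12).
def javaVersionFromClassFileMajor(major: Int): Int = major - 44

println(javaVersionFromClassFileMajor(55)) // 11: the error before the JDK upgrade
println(javaVersionFromClassFileMajor(56)) // 12: the error after the upgrade
```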

Below is the output of a very simple RDD creation, showing that the RDD works for some operations:

val seqNum = sc.parallelize(0 to 1000)
seqNum: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[37] at 
parallelize at <console>:24

seqNum.count
res30: Long = 1001

seqNum.sample(false, 0.01).foreach(println)
355
385
392
402
505
569
585

So the RDD is created and works fine. However, here is what happens with the exact same RDD and the take action:

seqNum.take(10).foreach(println)
java.lang.IllegalArgumentException: Unsupported class file major version 56
  at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
  at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
  at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
  at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
  at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
  at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
  at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
  at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
  at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
  at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
  at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
  at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
  at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
  at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
  at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
  at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
  at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
  at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
  at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
  at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
  at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
  at scala.collection.immutable.List.foreach(List.scala:392)
  at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
  at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
  at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
  at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
  at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1364)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
  at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
  at org.apache.spark.rdd.RDD.take(RDD.scala:1337)
  ... 49 elided

Since the RDD is created correctly and some actions work, I would expect all actions to work. Any idea what the problem is?

1 Answer:

Answer 0 (score: 0)

It looks like Spark 2.4 does not currently support Java 10/11 (or 12). This is tracked in the JIRA ticket https://issues.apache.org/jira/browse/SPARK-24417. Your stack trace points the same way: the failure happens in Spark's ClosureCleaner, which uses the bundled ASM 6 (org.apache.xbean.asm6) to parse closure class files, and ASM 6 does not recognize the Java 11/12 class file format (major versions 55 and 56). That also explains why only some actions fail: the ones that pass a closure through SparkContext.clean trigger the class file parsing. To make the job run, you may need to use JDK 8.
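A sketch of how to point the shell at a JDK 8 installation before launching spark-shell. The JDK path below is just an example (a typical Debian/Ubuntu OpenJDK 8 package location) and varies by OS and install method:

```shell
# Example path for a Debian/Ubuntu OpenJDK 8 package; adjust for your system.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"

java -version    # should now report 1.8.x
spark-shell      # the driver JVM picks up JAVA_HOME
```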