Spark task serialization and closures

Date: 2016-12-04 00:59:15

Tags: scala serialization apache-spark closures rdd

When a lambda passed to a Spark RDD operation references objects outside its own scope, Spark captures the context needed to build a serialized task for distributed execution. In the simple example below, why does it decide to serialize the entire OuterClass instance rather than just the multiplier? I suspect that multiplier is actually a Scala getter method, so the closure has to carry a reference to the class. Declaring OuterClass as extending Serializable would work, but it introduces an unnecessary constraint. I would really appreciate a way to make this work without declaring OuterClass serializable.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

object ClosureTest {
  def main(args: Array[String]): Unit = {
    val sc = SparkContext.getOrCreate(new SparkConf().setMaster("local[2]").setAppName("test"))
    println(new OuterClass(10).sparkSumProd(sc.parallelize(Seq(1, 2, 3))))
  }
  class OuterClass(multiplier: Int) {
    def sparkSumProd(data: RDD[Int]): Double = {
      // multiplier is a field of OuterClass, so this closure captures `this`
      data.map { v =>
        v * multiplier
      }.sum()
    }
  }
}
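What happens under the hood is the same mechanism as a Scala inner class: because multiplier belongs to OuterClass, the compiled lambda keeps a hidden $outer reference to the enclosing instance. The following self-contained sketch (hypothetical names, not the actual generated code) shows the equivalent structure:

// Hypothetical illustration: an inner class carries an implicit
// reference to its enclosing instance, just like the compiled lambda does.
class Outer(val multiplier: Int) {   // not Serializable, like OuterClass
  class Closure extends (Int => Int) with Serializable {
    // Outer.this is the hidden $outer field reported by the debugger;
    // serializing Closure therefore requires serializing Outer as well
    def apply(v: Int): Int = v * Outer.this.multiplier
  }
}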

Here is the output from Spark's SerializationDebugger:
Exception in thread "main" org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:298)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2056)
    at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:366)
    at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:365)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
    at org.apache.spark.rdd.RDD.map(RDD.scala:365)
    at ClosureTest$OuterClass.sparkSumProd(ClosureTest.scala:14)
    at ClosureTest$.main(ClosureTest.scala:10)
    at ClosureTest.main(ClosureTest.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.io.NotSerializableException: ClosureTest$OuterClass
Serialization stack:
    - object not serializable (class: ClosureTest$OuterClass, value: ClosureTest$OuterClass@36a7abe1)
    - field (class: ClosureTest$OuterClass$$anonfun$sparkSumProd$1, name: $outer, type: class ClosureTest$OuterClass)
    - object (class ClosureTest$OuterClass$$anonfun$sparkSumProd$1, <function1>)
    at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
    at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:295)
    ... 17 more
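The `$outer` field in the serialization stack above is exactly that hidden reference to the enclosing instance. A widely used pattern for avoiding it (not from the original post, just a common workaround) is to copy the field into a local val before building the closure, so the lambda captures only the Int:

import org.apache.spark.rdd.RDD

class OuterClass(multiplier: Int) {
  def sparkSumProd(data: RDD[Int]): Double = {
    val localMultiplier = multiplier    // plain Int local; no reference to `this`
    data.map(v => v * localMultiplier)  // closure now serializes just the Int
      .sum()
  }
}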

0 Answers:

No answers yet.