Scala: XStream complains that an object is not serializable

Asked: 2016-09-12 19:57:42

Tags: scala apache-spark xml-deserialization

I have defined the case classes below, and I want to use XStream to print each ClientData in XML format:

case class Address(addressLine1: String,
                   addressLine2: String,
                   city: String,
                   provinceCode: String,
                   country: String,
                   addressTypeDesc: String) extends Serializable

case class ClientData(title: String,
                      firstName: String,
                      lastName: String,
                      addrList: Option[List[Address]]) extends Serializable


object ex1 {
  def main(args: Array[String]) {
    ...
    ...
    ...

    // In below, x is Try[ClientData]
    val xstream = new XStream(new DomDriver)
    newClientRecord.foreach(x => if (x.isSuccess) println(xstream.toXML(x.get)))
  }
}

When the program reaches the line that prints each ClientData in XML format, I get the runtime error below. Please help.

Exception in thread "main" org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:304)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2055)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:911)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:910)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.RDD.foreach(RDD.scala:910)
    at lab9$.main(lab9.scala:63)
    at lab9.main(lab9.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.io.NotSerializableException: com.thoughtworks.xstream.XStream
Serialization stack:
    - object not serializable (class: com.thoughtworks.xstream.XStream, value: com.thoughtworks.xstream.XStream@51e94b7d)
    - field (class: lab9$$anonfun$main$1, name: xstream$1, type: class com.thoughtworks.xstream.XStream)
    - object (class lab9$$anonfun$main$1, <function1>)
    at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
    at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:101)
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:301)
    ... 16 more

1 Answer:

Answer 0 (score: 0)

It's not XStream complaining, it's Spark. Spark Java-serializes the closure you pass to `foreach` so it can ship it to the executors; because your closure captures the `xstream` variable, the non-serializable XStream instance gets dragged into the serialized closure (that is exactly what the `ClosureCleaner` frames and the `field ... name: xstream$1` line in the stack trace show). You need to define the `xstream` variable inside the task:

newClientRecord.foreach { x =>
  if (x.isSuccess) {
    val xstream = new XStream(new DomDriver)
    println(xstream.toXML(x.get))
  }
}

Use that version if XStream is cheap enough to create once per record. Otherwise, create one instance per partition:

newClientRecord.foreachPartition { xs => 
  val xstream = new XStream(new DomDriver)
  xs.foreach { x =>
    if (x.isSuccess) {
      println(xstream.toXML(x.get)) 
    }
  }
}

Either way the XStream instance is constructed on the executor, inside the task, so it never has to cross the closure-serialization boundary; the `foreachPartition` variant additionally amortizes its construction cost over all records in the partition.
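The mechanism can be demonstrated without Spark at all: before running a task, Spark simply attempts to Java-serialize the function object, and fails if the closure has captured anything non-serializable. A minimal sketch of that check (`NotSerializableHelper` is a hypothetical stand-in for XStream, and `canSerialize` is an illustrative helper, not a Spark API):

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Hypothetical stand-in for XStream: a class that is NOT Serializable.
class NotSerializableHelper {
  def render(s: String): String = s"<value>$s</value>"
}

object ClosureDemo {
  // Roughly what Spark's ClosureCleaner does: try to Java-serialize the closure.
  def canSerialize(f: String => String): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(f)
      true
    } catch {
      case _: NotSerializableException => false
    }

  def main(args: Array[String]): Unit = {
    val helper = new NotSerializableHelper

    // Captures `helper`, so serializing the closure fails -- the situation in the question.
    val capturing: String => String = s => helper.render(s)

    // Builds the helper inside the function body, so nothing bad is captured -- the fix.
    val selfContained: String => String = s => new NotSerializableHelper().render(s)

    println(canSerialize(capturing))
    println(canSerialize(selfContained))
  }
}
```

Running `main` prints `false` for the capturing closure and `true` for the self-contained one, which is why moving `new XStream(new DomDriver)` inside the `foreach`/`foreachPartition` body makes the Spark error go away.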