Task Not Serializable exception when using JmsTemplate inside Spark foreach

Posted: 2015-12-29 14:46:21

Tags: serialization apache-spark jmstemplate

I am trying to use the Spring JmsTemplate class inside an rdd.foreach call, but I get a Task not serializable error. When I tried a static variable instead, it works in local mode, but on the cluster I get a NullPointerException.

Sample code:

inputRDD.foreach(record -> {
    messageServices.send(record);
});

Error log:

org.apache.spark.SparkException: Task not serializable
       at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:315)
       at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:305)
       at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:132)
       at org.apache.spark.SparkContext.clean(SparkContext.scala:1891)
       at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:869)
       at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:868)
       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
       at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
       at org.apache.spark.rdd.RDD.foreach(RDD.scala:868)
       at org.apache.spark.api.java.JavaRDDLike$class.foreach(JavaRDDLike.scala:327)
       at org.apache.spark.api.java.AbstractJavaRDDLike.foreach(JavaRDDLike.scala:47)
       at com.messenger.MessengerDriver.runJob(MessengerDriver.java:108)
       at com.messenger.MessengerDriver.main(MessengerDriver.java:60)
Caused by: java.io.NotSerializableException: org.springframework.jms.core.JmsTemplate
Serialization stack:
       - object not serializable (class: org.springframework.jms.core.JmsTemplate, value: org.springframework.jms.core.JmsTemplate@3b98b809)
       - field (class: com.messenger.Messenger.activemq.MessageProducer, name: jmsTemplate, type: class org.springframework.jms.core.JmsTemplate)
       - object (class com.messenger.Messenger.activemq.MessageProducer, com.messenger.Messenger.activemq.MessageProducer@662e682a)
       - field (class: java.lang.invoke.SerializedLambda, name: capturedArgs, type: class [Ljava.lang.Object;)
       - field (class: org.apache.spark.api.java.JavaRDDLike$$anonfun$foreach$1, name: f$14, type: interface org.apache.spark.api.java.function.VoidFunction)
       - object (class org.apache.spark.api.java.JavaRDDLike$$anonfun$foreach$1, <function1>)
       at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
       at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
       at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:81)
       at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:312)
       ... 13 more

Has anyone run into the same problem?

1 answer:

Answer 0 (score: 0)

The correct pattern is to use repartition and mapPartitions.
repartition reshapes the RDD into partitions of a suitable size;
mapPartitions processes each partition as a unit, so you can create one JmsTemplate per partition inside the function you pass in. Because the template is constructed on the executor rather than captured from the driver, nothing non-serializable ends up in the closure.
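A minimal sketch of that pattern, here using foreachPartition (the foreach-style variant of mapPartitions) since the goal is to send messages rather than produce a new RDD. It assumes an ActiveMQ broker; the names `PartitionedSender`, `buildJmsTemplate`, `brokerUrl`, and `queueName` are hypothetical, and the partition count is only an example:

```java
import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.spark.api.java.JavaRDD;
import org.springframework.jms.core.JmsTemplate;

public class PartitionedSender {

    // Construct the non-serializable JmsTemplate on the executor,
    // inside the task, instead of capturing it from the driver.
    private static JmsTemplate buildJmsTemplate(String brokerUrl) {
        return new JmsTemplate(new ActiveMQConnectionFactory(brokerUrl));
    }

    public static void send(JavaRDD<String> inputRDD,
                            String brokerUrl,
                            String queueName) {
        inputRDD
            .repartition(8) // pick a count that suits your cluster/broker
            .foreachPartition(records -> {
                // One JmsTemplate per partition. Only the String arguments
                // (which are serializable) are captured by the closure.
                JmsTemplate jmsTemplate = buildJmsTemplate(brokerUrl);
                while (records.hasNext()) {
                    jmsTemplate.convertAndSend(queueName, records.next());
                }
            });
    }
}
```

This amortizes the cost of creating the template and its connection over all records in a partition, which is why per-partition construction is preferred over creating one JmsTemplate per record inside a plain foreach.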