Task serialization failed: java.lang.reflect.InvocationTargetException

Asked: 2015-05-26 18:35:24

Tags: java apache-spark

I am getting the exception:

    Task serialization failed: java.lang.reflect.InvocationTargetException

My code is:

JDBCRDD jdbcRDD = new JDBCRDD(
        sc.sc(),
        Connection,
        getSchema(url),
        Table_Name,
        fields,
        new Filter[]{},
        partitionList.toArray(new JDBCPartition[0]));

System.out.println("count before to Java RDD=" + jdbcRDD.cache().count());

JavaRDD<Row> jrdd = jdbcRDD.toJavaRDD();

System.out.println("count=" + jrdd.count());

jrdd.foreachPartition((Iterator<Row> it) -> {
    new DataPull().updateDB(it);
});

The DataPull class is serializable.

However, I am still getting:

15/05/26 10:20:07 ERROR yarn.ApplicationMaster: User class threw exception: Job aborted due to stage failure: Task serialization failed: java.lang.reflect.InvocationTargetException
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:497)
org.apache.spark.serializer.SerializationDebugger$ObjectStreamClassMethods$.getObjFieldValues$extension(SerializationDebugger.scala:240)
org.apache.spark.serializer.SerializationDebugger$SerializationDebugger.visitSerializable(SerializationDebugger.scala:150)
org.apache.spark.serializer.SerializationDebugger$SerializationDebugger.visit(SerializationDebugger.scala:99)
org.apache.spark.serializer.SerializationDebugger$SerializationDebugger.visitSerializable(SerializationDebugger.scala:158)
org.apache.spark.serializer.SerializationDebugger$SerializationDebugger.visit(SerializationDebugger.scala:99)
org.apache.spark.serializer.SerializationDebugger$.find(SerializationDebugger.scala:58)
org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:39)
org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:80)
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:837)
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:778)
org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:762)
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1362)
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1354)
org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

I need help resolving this issue.
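For reference, whether a class like DataPull is actually serializable can be checked outside Spark with a plain java.io round trip. The sketch below is hypothetical (the field names and constructor are assumptions, and the original updateDB method is omitted); the key point is that any live JDBC connection must be marked transient, because java.sql.Connection is not serializable and would trigger exactly this kind of task-serialization failure:

```java
import java.io.Serializable;

// Hypothetical sketch of a partition handler that is safe to ship to executors.
public class DataPull implements Serializable {
    private static final long serialVersionUID = 1L;

    private final String jdbcUrl;                // plain value fields serialize fine
    private transient java.sql.Connection conn;  // live connections cannot be serialized

    public DataPull(String jdbcUrl) {
        this.jdbcUrl = jdbcUrl;
    }

    public String getJdbcUrl() {
        return jdbcUrl;
    }
}
```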

1 Answer:

Answer 0 (score: 2):

In my case, the snappy codec was causing this problem. Try a different one and see whether that resolves your issue.

You can edit conf/spark-defaults.conf and add, for example:

spark.io.compression.codec      lzf
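The same property can also be set without editing spark-defaults.conf: either programmatically when building the context, or per job at submit time with spark-submit --conf spark.io.compression.codec=lzf. A minimal sketch (the app name is hypothetical); in Spark 1.x the built-in codec values are lz4, lzf, and snappy, and the setting must be applied before the context is created:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Select the lzf codec programmatically instead of via conf/spark-defaults.conf.
SparkConf conf = new SparkConf()
        .setAppName("jdbc-pull")  // hypothetical app name
        .set("spark.io.compression.codec", "lzf");
JavaSparkContext sc = new JavaSparkContext(conf);
```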