org.apache.spark.SparkException: Task not serializable (Java)

Date: 2016-09-20 07:29:23

Tags: java serialization apache-spark

I am trying to insert results into MySQL through foreachPartition, but I am getting the error org.apache.spark.SparkException: Task not serializable.

public class Insert implements Serializable {

    transient static JavaSparkContext spc;

    public static void main(String gg[])
    {
        Map<String, String> options = new HashMap<String, String>();
        options.put("url", "jdbc:mysql://localhost:3306/testing?user=root&password=pwd");
        options.put("dbtable", "rtl");

        SparkConf ss = new SparkConf().setAppName("insert").setMaster("local");
        spc = new JavaSparkContext(ss);

        JavaRDD<String> rbm = spc.textFile(path);
        // DataFrame jdbcDF = sqlContext.jdbc(options.get("url"), options.get("dbtable"));
        // System.out.println("Data------------------->" + jdbcDF.toJSON().first());

        JavaRDD<String> file = rbm.flatMap(new FlatMapFunction<String, String>() {
            NotSerializableException nn = new NotSerializableException();

            public Iterable<String> call(String x) {
                return Arrays.asList(x.split("  ")[0]);
            }
        });

        try {
            file.foreachPartition(new VoidFunction<Iterator<String>>() {
                Connection conn = (Connection) DriverManager.getConnection("jdbc:mysql://localhost/testing", "root", "amd@123");
                PreparedStatement del = (PreparedStatement) conn.prepareStatement("INSERT INTO rtl (rtl_s) VALUES (?)");
                NotSerializableException nn = new NotSerializableException();

                public void call(Iterator<String> x) throws Exception {
                    while (x.hasNext()) {
                        String y = x.toString();
                        del.setString(1, y);
                        del.executeUpdate();
                    }
                }
            });
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

I am getting the following error:

6/09/20 12:37:58 INFO SparkContext: Created broadcast 0 from textFile at Insert.java:41
org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:304)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2055)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:919)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:918)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:918)
    at org.apache.spark.api.java.JavaRDDLike$class.foreachPartition(JavaRDDLike.scala:225)
    at org.apache.spark.api.java.AbstractJavaRDDLike.foreachPartition(JavaRDDLike.scala:46)
    at final_file.Insert.main(Insert.java:59)
Caused by: java.io.NotSerializableException: java.lang.Object
Serialization stack:
    - object not serializable (class: java.lang.Object, value: java.lang.Object@4395342)
    - writeObject data (class: java.util.HashMap)
    - object (class java.util.HashMap, {UTF-8=java.lang.Object@4395342, WINDOWS-1252=com.mysql.jdbc.SingleByteCharsetConverter@72ffabab, US-ASCII=com.mysql.jdbc.SingleByteCharsetConverter@6f5fa288})
    - field (class: com.mysql.jdbc.ConnectionImpl, name: charsetConverterMap, type: interface java.util.Map)
    - object (class com.mysql.jdbc.JDBC4Connection, com.mysql.jdbc.JDBC4Connection@6761e52a)
    - field (class: final_file.Insert$2, name: conn, type: interface com.mysql.jdbc.Connection)
    - object (class final_file.Insert$2, final_file.Insert$2@45436e66)
    - field (class: org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1, name: f$12, type: interface org.apache.spark.api.java.function.VoidFunction)
    - object (class org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1, <function1>)
    at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
    at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:101)
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:301)
    ... 12 more

I get the above error when trying to write the results to MySQL.

2 Answers:

Answer 0 (score: 0)

> The issue has been resolved after changing the try block: the Connection and PreparedStatement are now created inside call(), on the executor, instead of being captured and serialized with the closure.
> 
> try {
>     file.foreachPartition(new VoidFunction<Iterator<String>>() {
>         NotSerializableException nn = new NotSerializableException();
>
>         public void call(Iterator<String> x) throws Exception {
>             // Opened here, inside call(), so the connection is created
>             // on the executor and never has to be serialized.
>             Connection conn = (Connection) DriverManager.getConnection("jdbc:mysql://localhost/testing", "root", "amd@123");
>             PreparedStatement del = (PreparedStatement) conn.prepareStatement("INSERT INTO rtl (rtl_s) VALUES (?)");
>             while (x.hasNext()) {
>                 String y = x.next(); // next(), not toString(), to consume the iterator
>                 del.setString(1, y);
>                 del.executeUpdate();
>             }
>         }
>     });
> } catch (Exception e) {
>     e.printStackTrace();
> }
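
For reference, here is a slightly hardened sketch of the same fix. It is not from the answer above: it adds try-with-resources cleanup and JDBC batching, and reuses the rtl (rtl_s) table, credentials, and file RDD from the question. It is intended as a drop-in replacement for the foreachPartition block (imports: java.sql.Connection, java.sql.DriverManager, java.sql.PreparedStatement, java.util.Iterator, org.apache.spark.api.java.function.VoidFunction).

    file.foreachPartition(new VoidFunction<Iterator<String>>() {
        public void call(Iterator<String> rows) throws Exception {
            // All JDBC objects are created per partition, on the executor,
            // so nothing non-serializable is captured by the closure.
            try (Connection conn = DriverManager.getConnection("jdbc:mysql://localhost/testing", "root", "amd@123");
                 PreparedStatement insert = conn.prepareStatement("INSERT INTO rtl (rtl_s) VALUES (?)")) {
                while (rows.hasNext()) {
                    insert.setString(1, rows.next());
                    insert.addBatch();
                }
                insert.executeBatch(); // one round trip per partition instead of one per row
            }
        }
    });

Opening one connection per partition (rather than per record) is exactly what foreachPartition is for; connection pooling on each executor would reduce the cost further.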

Answer 1 (score: 0)

When you use one of Spark's operations (such as map, flatMap, ...), Spark tries to serialize every function, method, and field that the closure refers to.

But a method or a field cannot be serialized on its own, so the whole class that the method or field belongs to is serialized instead.

If that class does not implement java.io.Serializable, this exception occurs. You can find where the NotSerializableException was triggered by following the serialization stack in the error; the sketch below reproduces the mechanism outside of Spark.
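
A minimal, self-contained illustration of that mechanism, using plain Java serialization in place of Spark's closure serializer — the CaptureDemo class and SerializableRunnable interface are invented here for the demo:

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;

    public class CaptureDemo {
        interface SerializableRunnable extends Runnable, Serializable {}

        public static void main(String[] args) throws IOException {
            Runnable task = new SerializableRunnable() {
                // A non-serializable field, playing the role of the JDBC
                // Connection in the question's anonymous class.
                Object notSerializable = new Object();
                public void run() { System.out.println(notSerializable); }
            };
            // Spark does essentially this when it ships a closure to executors:
            new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(task);
            // -> java.io.NotSerializableException: java.lang.Object
        }
    }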

In your case, the top of the serialization stack points straight at the problem: a plain java.lang.Object inside the MySQL driver's charset-converter map, reached (as the full stack above shows) through the conn field of your anonymous class final_file.Insert$2:

Caused by: java.io.NotSerializableException: java.lang.Object
Serialization stack:
    - object not serializable (class: java.lang.Object, value: java.lang.Object@4395342)
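
Since the whole enclosing class is what gets serialized, another way to stay in control — sketched here as a variation, not taken from either answer — is a named function class whose only fields are serializable, with every JDBC object kept local to call():

    // Only the String field crosses the wire; the Connection never exists on the driver.
    class InsertPartition implements VoidFunction<Iterator<String>> {
        private final String url;

        InsertPartition(String url) { this.url = url; }

        public void call(Iterator<String> rows) throws Exception {
            try (Connection conn = DriverManager.getConnection(url, "root", "amd@123");
                 PreparedStatement ps = conn.prepareStatement("INSERT INTO rtl (rtl_s) VALUES (?)")) {
                while (rows.hasNext()) {
                    ps.setString(1, rows.next());
                    ps.executeUpdate();
                }
            }
        }
    }

    // usage: file.foreachPartition(new InsertPartition("jdbc:mysql://localhost/testing"));

VoidFunction in Spark's Java API extends Serializable, so InsertPartition serializes cleanly as long as its own fields do.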