StackOverflowError when running an iterative program in Spark (Java)

Date: 2015-08-04 10:54:10

Tags: java apache-spark cluster-analysis

I am trying to implement a hierarchical agglomerative clustering algorithm in Spark.

Running in local mode, it always throws a java.lang.StackOverflowError when the input data is large.

It is said (e.g., in "Cut off the super long serialization chain") that this happens because the serialization chain becomes too long. So I added a checkpoint every few iterations, but it still does not work. Has anyone else run into this problem?
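(One thing to keep in mind: checkpoint() requires a checkpoint directory to have been set on the SparkContext up front, otherwise Spark raises an error when the checkpoint is attempted. A minimal setup sketch; the directory path is just a placeholder:)

    // Must run before any RDD.checkpoint() call; the path is a placeholder.
    sc.setCheckpointDir("/tmp/spark-checkpoints");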

My code is:


JavaRDD<LocalCluster> run(JavaRDD<Document> documents)
{
    @SuppressWarnings("resource")
    JavaSparkContext sc = new JavaSparkContext(documents.context());
    Broadcast<Double> mergeCriterionBroadcast = sc.broadcast(mergeCriterion);
    Accumulable<Long, Long> clusterIdAccumulator = sc.accumulable(documents.map(doc -> doc.getId()).max(Comparator.naturalOrder()), new LongAccumulableParam());

    // Clusters.
    JavaRDD<LocalCluster> clusters = documents.map(document -> 
    {
        return LocalCluster.of(document.getId(), Arrays.asList(document));          
    }).cache();

    // Compute pair-wise cluster similarity; initially, each document forms its own cluster.
    JavaPairRDD<Tuple2<LocalCluster, LocalCluster>, Double> similarities = clusters.cartesian(clusters).filter(clusterPair -> (clusterPair._1().getId() < clusterPair._2().getId()))
    .mapToPair(clusterPair ->
    {
        return new Tuple2<Tuple2<LocalCluster, LocalCluster>, Double>(clusterPair, LocalCluster.getSimilarity(clusterPair._1(), clusterPair._2()));
    })
    .filter(tuple -> (tuple._2() >= mergeCriterionBroadcast.value())).cache();

    // Merge the most similar two clusters.
    long count = similarities.count();
    int loops = 0;
    while (count > 0)
    {
        System.out.println("Count: " + count);
        Tuple2<Tuple2<LocalCluster, LocalCluster>, Double> mostSimilar = similarities.max(SerializableComparator.serialize((a, b) -> Double.compare(a._2(), b._2())));

        Broadcast<Tuple2<Long, Long>> MOST_SIMILAR = sc.broadcast(new Tuple2<Long, Long>(mostSimilar._1()._1().getId(), mostSimilar._1()._2().getId()));

        clusterIdAccumulator.add(1L);
        LocalCluster newCluster = LocalCluster.merge(mostSimilar._1()._1(), mostSimilar._1()._2(), clusterIdAccumulator.value());

        JavaRDD<LocalCluster> newClusterRDD = sc.parallelize(Arrays.asList(newCluster));
        JavaRDD<LocalCluster> filteredClusters = clusters.filter(cluster -> (cluster.getId() != MOST_SIMILAR.value()._1() && cluster.getId() != MOST_SIMILAR.value()._2()));

        JavaPairRDD<Tuple2<LocalCluster, LocalCluster>, Double> newSimilarities = filteredClusters.cartesian(newClusterRDD)
        .mapToPair(clusterPair ->
        {
            return new Tuple2<Tuple2<LocalCluster, LocalCluster>, Double>(clusterPair, LocalCluster.getSimilarity(clusterPair._1(), clusterPair._2()));
        })
        .filter(tuple -> (tuple._2() >= mergeCriterionBroadcast.value()));

        clusters = filteredClusters.union(newClusterRDD).coalesce(2).cache();
        similarities = similarities.filter(tuple -> 
                (tuple._1()._1().getId() != MOST_SIMILAR.value()._1()) && 
                (tuple._1()._1().getId() != MOST_SIMILAR.value()._2()) && 
                (tuple._1()._2().getId() != MOST_SIMILAR.value()._1()) && 
                (tuple._1()._2().getId() != MOST_SIMILAR.value()._2()))
                .union(newSimilarities).coalesce(4).cache();
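        // Checkpoint every few iterations, trying to cut the growing lineage.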
        if ((loops++) >= 2)
        {
            clusters.checkpoint();
            similarities.checkpoint();
            loops = 0;
        }

        count = similarities.count();
    }

    return clusters.filter(cluster -> cluster.getDocuments().size() > 1);
}

The first part of the stack trace:

15/08/05 10:11:19 ERROR TaskSetManager: Failed to serialize task 4078, not attempting to retry it.

java.io.IOException: java.lang.StackOverflowError
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1242)
at org.apache.spark.rdd.CoalescedRDDPartition.writeObject(CoalescedRDD.scala:45)
at sun.reflect.GeneratedMethodAccessor28.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.defaultWriteObject(ObjectOutputStream.java:441)
at org.apache.spark.rdd.UnionPartition$$anonfun$writeObject$1.apply$mcV$sp(UnionRDD.scala:55)
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1239)
at org.apache.spark.rdd.UnionPartition.writeObject(UnionRDD.scala:52)
at sun.reflect.GeneratedMethodAccessor27.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.defaultWriteObject(ObjectOutputStream.java:441)
at org.apache.spark.rdd.CoalescedRDDPartition$$anonfun$writeObject$1.apply$mcV$sp(CoalescedRDD.scala:48)
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1239)
at org.apache.spark.rdd.CoalescedRDDPartition.writeObject(CoalescedRDD.scala:45)
at sun.reflect.GeneratedMethodAccessor28.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.defaultWriteObject(ObjectOutputStream.java:441)
at org.apache.spark.rdd.UnionPartition$$anonfun$writeObject$1.apply$mcV$sp(UnionRDD.scala:55)
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1239)
at org.apache.spark.rdd.UnionPartition.writeObject(UnionRDD.scala:52)
at sun.reflect.GeneratedMethodAccessor27.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.defaultWriteObject(ObjectOutputStream.java:441)
... (the CoalescedRDDPartition / UnionPartition serialization frames above repeat many more times until the stack overflows)

Before the error occurred, there were many warnings like this.

Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1266)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1257)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1256)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1256)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1450)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1411)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

1 Answer:

Answer 0 (score: 3)

Finally, I found the problem: it is about the usage of checkpoint.

In Spark, checkpointing allows the user to truncate an RDD's lineage, but the checkpoint is only actually created once the RDD has been computed by an action (such as collect or count).
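A minimal sketch of this behavior (the RDD, loop, and directory here are purely illustrative):

    JavaRDD<Integer> rdd = sc.parallelize(Arrays.asList(1, 2, 3));
    for (int i = 0; i < 100; i++) {
        rdd = rdd.map(x -> x + 1); // the lineage grows by one step per iteration
    }
    sc.setCheckpointDir("/tmp/spark-checkpoints"); // placeholder path
    rdd.checkpoint();                        // only marks the RDD for checkpointing
    System.out.println(rdd.toDebugString()); // still prints the full map(...) chain
    rdd.count();                             // the action materializes the checkpoint
    System.out.println(rdd.toDebugString()); // lineage is now truncated at the checkpoint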

In my program, no action was ever called on the clusters RDD, so its checkpoint was never created; the serialization chain of clusters kept growing and finally caused the StackOverflowError. To fix this, call count() right after checkpoint():

        clusters = filteredClusters.union(newClusterRDD).coalesce(2).cache();
        similarities = similarities.filter(tuple -> 
                (tuple._1()._1().getId() != MOST_SIMILAR.value()._1()) && 
                (tuple._1()._1().getId() != MOST_SIMILAR.value()._2()) && 
                (tuple._1()._2().getId() != MOST_SIMILAR.value()._1()) && 
                (tuple._1()._2().getId() != MOST_SIMILAR.value()._2()))
                .union(newSimilarities).coalesce(4).cache();
        if ((loops++) >= 50)
        {
            clusters.checkpoint();
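            // An action must follow, so that the checkpoint is actually written.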
            clusters.count();
            similarities.checkpoint();
            loops = 0;
        }
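Note that similarities needs no extra action here, because count = similarities.count() at the bottom of the loop already materializes its checkpoint; it was clusters that never had an action called on it.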