I am seeing "GC overhead limit exceeded"

Posted: 2019-05-21 16:10:22

Tags: python apache-spark pyspark

I would like help with the following error, which occurs when I run a Spark program using the pyspark-2.4.0 Python library. The job runs for a long time, and then I see this:

ERROR Executor:91 - Exception in task 712.0 in stage 349.0 (TID 61032)
java.lang.OutOfMemoryError: GC overhead limit exceeded
    at java.io.ObjectInputStream$HandleTable.markDependency(ObjectInputStream.java:3633)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:423)
    at scala.collection.immutable.HashMap$SerializationProxy.readObject(HashMap.scala:582)
    at sun.reflect.GeneratedMethodAccessor33.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
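For context on the error above: the JVM throws "GC overhead limit exceeded" when it spends the vast majority of its time in garbage collection while reclaiming almost no heap, i.e. the executor is effectively out of memory. The Spark settings usually tuned first in this situation can be passed to spark-submit; the sketch below uses placeholder values, and `my_job.py` is a stand-in for the actual script, not something taken from the question:

```shell
# Placeholder memory sizes and partition count; adjust to the actual job
# and cluster. my_job.py is a hypothetical name for the PySpark script.
spark-submit \
  --driver-memory 8g \
  --executor-memory 8g \
  --conf spark.executor.memoryOverhead=2g \
  --conf spark.sql.shuffle.partitions=400 \
  my_job.py
```

Raising `--executor-memory` gives each executor JVM a larger heap, while increasing `spark.sql.shuffle.partitions` makes each shuffle partition (and therefore each task's working set, such as the HashMap being deserialized in the stack trace) smaller.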

0 Answers:

There are no answers yet.