I am using Spark 2.1.0. I have two DataFrames, each no larger than 3 MB. When I run an inner join on the two DataFrames, all of my transformation logic works perfectly. But when I use a right outer join on the same two DataFrames, I get the error below.
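For context, a minimal sketch of the two joins being compared. The names dfA and dfB, the input paths, and the join key id are hypothetical stand-ins, since the question does not show the actual code:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("join-repro").getOrCreate()

    // Hypothetical stand-ins for the two ~3 MB DataFrames from the question.
    val dfA = spark.read.parquet("/path/to/first")
    val dfB = spark.read.parquet("/path/to/second")

    // The inner join, which completes fine:
    val inner = dfA.join(dfB, Seq("id"), "inner")

    // The right outer join, which triggers the executor loss:
    val rightOuter = dfA.join(dfB, Seq("id"), "right_outer")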
Error:
Container killed by YARN for exceeding memory limits. 1.5 GB of 1.5 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
17/08/02 02:29:53 ERROR cluster.YarnScheduler: Lost executor 337 on ip-172-21-1-105.eu-west-1.compute.internal: Container killed by YARN for exceeding memory limits. 1.5 GB of 1.5 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
17/08/02 02:29:53 WARN scheduler.TaskSetManager: Lost task 34.0 in stage 283.0 (TID 11396, ip-172-21-1-105.eu-west-1.compute.internal, executor 337): ExecutorLostFailure (executor 337 exited caused by one of the running tasks) Reason: Container killed by YARN for exceeding memory limits. 1.5 GB of 1.5 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
17/08/02 02:29:53 WARN server.TransportChannelHandler: Exception in connection from /172.21.1.105:50342
java.io.IOException: Connection reset by peer
Alternatives I have tried: 1) df.coalesce(x).show() with different values of x, 2) setting the executor memory. Neither has worked.
This issue has been outstanding for the past few weeks. Can anyone tell me where I am going wrong?
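For what it is worth, the error message itself suggests raising spark.yarn.executor.memoryOverhead, the Spark 2.1 setting for the executor's off-heap headroom on YARN. A minimal sketch of one way to set it; the 1024 MB figure is an illustrative assumption, not a tuned value, and the same setting can also be passed to spark-submit with --conf:

    import org.apache.spark.sql.SparkSession

    // spark.yarn.executor.memoryOverhead is specified in megabytes and must be
    // in place before executors are requested; 1024 is only an illustrative guess.
    val spark = SparkSession.builder()
      .appName("join-repro")
      .config("spark.yarn.executor.memoryOverhead", "1024")
      .getOrCreate()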
Answer 0 (score: 0)
Could you please share more details about the datasets?
Have you tried a leftOuterJoin, and does it give you the same error?
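A minimal sketch of that diagnostic, reusing the hypothetical dfA, dfB, and key id from above:

    // If left_outer completes where right_outer dies, the failure is specific
    // to preserving the right-hand side, which helps narrow down any skew or
    // size asymmetry between the two inputs.
    val leftOuter = dfA.join(dfB, Seq("id"), "left_outer")
    leftOuter.count()  // does this finish without losing executors?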
Regards,
Neeraj