Spark error when saving to Parquet: IndexError: pop from an empty deque

Asked: 2020-07-09 18:24:24

Tags: amazon-web-services apache-spark pyspark amazon-emr

I get this error when running a Spark cluster on YARN through the AWS EMR service:

ERROR:root:Exception while sending command.
Traceback (most recent call last):
  File "/mnt/yarn/usercache/hadoop/appcache/application_1594292341949_0004/container_1594292341949_0004_01_000001/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1159, in send_command
    raise Py4JNetworkError("Answer from Java side is empty")
py4j.protocol.Py4JNetworkError: Answer from Java side is empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/mnt/yarn/usercache/hadoop/appcache/application_1594292341949_0004/container_1594292341949_0004_01_000001/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 985, in send_command
    response = connection.send_command(command)
  File "/mnt/yarn/usercache/hadoop/appcache/application_1594292341949_0004/container_1594292341949_0004_01_000001/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1164, in send_command
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
py4j.protocol.Py4JNetworkError: Error while receiving
Traceback (most recent call last):
  File "process_ecommerce.py", line 131, in <module>
    cfg["partitions"]["info"]
  File "/mnt/yarn/usercache/hadoop/appcache/application_1594292341949_0004/container_1594292341949_0004_01_000001/__pyfiles__/spark_utils.py", line 10, in save_dataframe
    .parquet(path)
  File "/mnt/yarn/usercache/hadoop/appcache/application_1594292341949_0004/container_1594292341949_0004_01_000001/pyspark.zip/pyspark/sql/readwriter.py", line 844, in parquet
  File "/mnt/yarn/usercache/hadoop/appcache/application_1594292341949_0004/container_1594292341949_0004_01_000001/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
  File "/mnt/yarn/usercache/hadoop/appcache/application_1594292341949_0004/container_1594292341949_0004_01_000001/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
  File "/mnt/yarn/usercache/hadoop/appcache/application_1594292341949_0004/container_1594292341949_0004_01_000001/py4j-0.10.7-src.zip/py4j/protocol.py", line 336, in get_return_value
py4j.protocol.Py4JError: An error occurred while calling o343.parquet
ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java server (127.0.0.1:36063)
Traceback (most recent call last):
  File "/mnt/yarn/usercache/hadoop/appcache/application_1594292341949_0004/container_1594292341949_0004_01_000001/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 929, in _get_connection
    connection = self.deque.pop()
IndexError: pop from an empty deque
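
For context, the write happens inside a small helper in spark_utils.py; the traceback only shows that it ends in a .parquet(path) call, so the following is a hypothetical sketch of such a helper. Only the function name and the final .parquet(path) call come from the traceback; the signature, the repartitioning, and the write mode are assumptions.

def save_dataframe(df, path, partitions):
    """Write a DataFrame to Parquet (hypothetical sketch)."""
    (df.repartition(partitions)   # assumed: partition count, e.g. cfg["partitions"]["info"]
       .write
       .mode("overwrite")         # assumed write mode
       .parquet(path))            # the call where the Py4J connection drops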

I am running a cluster with 1 master node of type r5.2xlarge and 20 slave nodes, each with 8 CPUs and 64 GB of memory. Spark is configured with (see the configuration sketch after the list):

  • 20 GB of executor memory
  • 30 GB of executor memory overhead
  • 8 cores per executor
  • 1 CPU per task
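
For reference, these settings correspond to the standard Spark configuration keys sketched below. This is an illustration only: on EMR they would normally be passed via spark-submit flags or a cluster configuration, not hard-coded, and the app name here is an assumption.

from pyspark import SparkConf
from pyspark.sql import SparkSession

conf = (SparkConf()
        .setAppName("process_ecommerce")              # assumed app name
        .set("spark.executor.memory", "20g")          # 20 GB executor memory
        .set("spark.executor.memoryOverhead", "30g")  # 30 GB executor memory overhead
        .set("spark.executor.cores", "8")             # 8 cores per executor
        .set("spark.task.cpus", "1"))                 # 1 CPU per task

spark = SparkSession.builder.config(conf=conf).getOrCreate()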

How can I fix this error?

0 Answers:

There are no answers yet.