Spark Memory Error

Date: 2018-03-08 13:08:58

Tags: apache-spark

I am running into the following error in Spark 1.5:

 Diagnostics: Container [pid=19554,containerID=container_e94_1518800506024_42837_02_000017] is running beyond physical memory limits. Current usage: 3.5 GB of 3.5 GB physical memory used; 4.3 GB of 7.3 GB virtual memory used. Killing container. Dump of the process-tree for container_e94_1518800506024_42837_02_000017
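The limit in this message is YARN's container allocation, not the Spark heap alone. A rough sketch of where the 3.5 GB figure can come from, assuming Spark 1.x defaults (per-container memory overhead of `max(384 MB, 10% of the requested memory)`) and assuming YARN rounds allocations up to its `yarn.scheduler.minimum-allocation-mb` increment; none of these numbers are taken from the logs above:

    # Approximate container sizing under assumed Spark 1.x defaults:
    #   overhead           = max(384 MB, 0.10 * requested memory)
    #   driver container   ~ 3072 MB + 384 MB = 3456 MB -> rounded up to 3.5 GB
    #   executor container ~ 4096 MB + 410 MB = 4506 MB -> rounded up to 4.5 GB
    # YARN kills a container as soon as its process tree's physical memory
    # exceeds the allocated size, which is what the diagnostics report shows.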
  

    MASTER_URL = yarn-cluster
    NUM_EXECUTORS = 10
    EXECUTOR_MEMORY = 4G
    EXECUTOR_CORES = 6
    DRIVER_MEMORY = 3G
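For reference, these settings would typically be passed to spark-submit roughly as follows. This is a minimal sketch, assuming the values above are shell variables; the jar name and main class are placeholders, not taken from the question:

    # Hypothetical submit command; com.example.MyApp and my-app.jar are placeholders.
    spark-submit \
      --master "$MASTER_URL" \
      --num-executors "$NUM_EXECUTORS" \
      --executor-memory "$EXECUTOR_MEMORY" \
      --executor-cores "$EXECUTOR_CORES" \
      --driver-memory "$DRIVER_MEMORY" \
      --class com.example.MyApp \
      my-app.jar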

The data the application reads is a 7 MB Avro file, but the Spark application performs multiple writes.

Is there a problem with the job configuration?

0 Answers:

No answers yet.