Error initializing SparkContext: java.io.IOException: No space left on device

Date: 2016-01-28 15:07:21

Tags: python apache-spark cluster-computing pyspark

I start a pyspark session with the following shell command:

pyspark --master yarn-client --num-executors 16 --driver-memory 16g --executor-memory 6g

and I get the following error:

ERROR SparkContext: Error initializing SparkContext.
java.io.IOException: No space left on device
        at java.io.FileOutputStream.writeBytes(Native Method)
        at java.io.FileOutputStream.write(FileOutputStream.java:345)
        at java.util.zip.DeflaterOutputStream.deflate(DeflaterOutputStream.java:253)
        etc ...

It looks like the worker nodes ("slaves") have run out of disk space. How can I clean this up?
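For context, Spark spills shuffle and scratch data under spark.local.dir, which defaults to /tmp, so a small /tmp partition fills up quickly. Below is a minimal sketch of how one might check how much space these files take; the spark-* pattern matches Spark's default scratch-directory naming, and the nm-local-dir path assumes YARN's default yarn.nodemanager.local-dirs location (these paths may differ on a configured cluster):

# Scratch directories created by Spark under the default spark.local.dir (/tmp)
du -sh /tmp/spark-* 2>/dev/null

# On YARN, executor scratch space lives under yarn.nodemanager.local-dirs instead
# (default: /tmp/hadoop-<user>/nm-local-dir)
du -sh /tmp/hadoop-*/nm-local-dir/usercache/* 2>/dev/null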

Edit: here is what I get when I run df -h on the VM from which I want to launch the job:

Filesystem                 Size  Used Avail Use% Mounted on
/dev/mapper/rootvg01-lv01   20G   18G  2.5G  88% /
devtmpfs                    16G     0   16G   0% /dev
tmpfs                       16G  5.7G   10G  36% /dev/shm
tmpfs                       16G  1.6G   15G  11% /run
tmpfs                       16G     0   16G   0% /sys/fs/cgroup
/dev/mapper/rootvg01-lv03   20G  933M   19G   5% /var
/dev/mapper/rootvg01-lv02  2.0G   33M  2.0G   2% /tmp
/dev/sda1                  997M   92M  905M  10% /boot
/dev/mapper/rootvg01-lv04  1.6T  892G  734G  55% /data
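
From this output, / is 88% full and the dedicated /tmp volume is only 2.0G, while /data has 734G free. One possible workaround (a sketch only, not a verified fix; /data/tmp/spark is a hypothetical directory that would need to exist and be writable by the Spark user) is to point Spark's scratch space at the large volume:

# /data/tmp/spark is a hypothetical path on the large /data volume
mkdir -p /data/tmp/spark
pyspark --master yarn-client --num-executors 16 --driver-memory 16g --executor-memory 6g \
  --conf spark.local.dir=/data/tmp/spark

Note that in YARN mode, spark.local.dir only affects the driver side; executors inherit their scratch directories from yarn.nodemanager.local-dirs on each node manager, so that setting would also need to point at a volume with enough space.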

0 Answers