Submitting Spark jobs on a Google Cloud cluster suddenly started failing with the following message:
ERROR org.apache.spark.SparkContext: Error initializing SparkContext.
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create directory /user/root/.sparkStaging/application_1461774669718_0005. Name node is in safe mode.
The reported blocks 0 needs additional 194 blocks to reach the threshold 0.9990 of total blocks 194.
The number of live datanodes 0 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
The cluster I'm using was working fine until recently, and nothing has changed in the code that sets up the SparkContext. It looks like safe mode is not being turned off for some reason. Any ideas on how to debug this?
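For reference, this is roughly how I have been checking the NameNode's safe mode state from the cluster's master node (standard HDFS admin commands; the hostname is whatever your master node is):

```shell
# Query whether the NameNode is currently in safe mode
hdfs dfsadmin -safemode get

# Summarize DataNode status; the error above reports 0 live datanodes,
# so this should show whether any DataNodes are registered at all
hdfs dfsadmin -report
```

`-safemode get` keeps reporting `Safe mode is ON`, which matches the error. Since the log says "The number of live datanodes 0", it seems no DataNodes are reporting their blocks, so the 0.9990 block threshold can never be reached.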