Spark worker node stops automatically

Date: 2016-01-13 08:03:22

Tags: java apache-spark

I am running a Spark Standalone cluster, and when I submit an application, the Spark driver stops with the following error.

16/01/12 23:26:14 INFO Worker: Asked to kill executor app-20160112232613-0012/0
16/01/12 23:26:14 INFO ExecutorRunner: Runner thread for executor app-20160112232613-0012/0 interrupted
16/01/12 23:26:14 INFO ExecutorRunner: Killing process!
16/01/12 23:26:14 ERROR FileAppender: Error writing stream to file /spark/spark-1.4.1/work/app-20160112232613-0012/0/stderr
java.io.IOException: Stream closed
        at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:170)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:283)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
        at java.io.FilterInputStream.read(FilterInputStream.java:107)
        at org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
        at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
        at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
        at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772)
        at org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
16/01/12 23:26:14 INFO Worker: Executor app-20160112232613-0012/0 finished with state KILLED exitStatus 143
16/01/12 23:26:14 INFO Worker: Cleaning up local directories for application app-20160112232613-0012

I am new to Spark and its processing. Please help me.

2 Answers:

Answer 0 (score: 4)

The failure is not caused by the java.io.IOException. As you can clearly see from 16/01/12 23:26:14 INFO Worker: Asked to kill executor app-20160112232613-0012/0, the executor is being killed first; the exception is only raised afterwards, when Spark tries to keep writing the executor's log file. Look in those logs to find the real cause of the error.

Even if you run spark-submit with root privileges, it is the spark user who writes the files. I am guessing you are running this on your laptop. Try running sudo chmod -R 777 on the spark folder.
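
If you want to confirm the permission problem from the JVM itself, here is a minimal diagnostic sketch (not part of the original answer; the path is the one shown in the log above, and WorkDirCheck is just an illustrative name). Run it as the same user that launches the worker and it reports whether that user can actually write under the Spark work directory:

import java.io.File

object WorkDirCheck {
  def main(args: Array[String]): Unit = {
    // Path taken from the error message above; adjust it for your installation.
    val workDir = new File("/spark/spark-1.4.1/work")
    println(s"user.name = ${System.getProperty("user.name")}")
    println(s"exists    = ${workDir.exists()}")
    println(s"canWrite  = ${workDir.canWrite()}")
    // Creating and deleting a throwaway file confirms write access end to end.
    try {
      val probe = File.createTempFile("perm-check", ".tmp", workDir)
      println(s"created   = ${probe.getAbsolutePath}")
      probe.delete()
    } catch {
      case e: Exception => println(s"write failed: $e")
    }
  }
}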

Answer 1 (score: 2)

In my case, the problem was that the Spark driver could not obtain the dependencies from the submitted executable jar. I merged all the dependencies into a single executable (fat) jar, and that solved the problem.

Please bear with my terminology :)
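
For reference, here is a minimal sketch of how such a fat jar can be built with the sbt-assembly plugin, assuming an sbt project; the project name, versions, and the extra dependency below are placeholders, not taken from the original post:

// build.sbt
// project/plugins.sbt would also need:
//   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")

name := "spark-app"

scalaVersion := "2.10.6"

// Spark itself is provided by the standalone cluster at runtime,
// so it is kept out of the assembled jar via the "provided" scope.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1" % "provided"

// Everything else the application needs gets bundled into the single jar;
// this dependency is only a placeholder example.
libraryDependencies += "com.typesafe" % "config" % "1.3.0"

Running sbt assembly then produces one self-contained jar under target/scala-2.10/ that can be passed directly to spark-submit. The equivalent with Maven would be the shade or assembly plugin.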