How to kill SparkR job

Date: 2016-10-13 06:52:56

Tags: apache-spark sparkr

I am using Spark 2.0.0 and have a web-based RStudio, through which I am using the SparkR package.

If I need to kill a job partway through a long-running program, how can I do that?

The STOP button in RStudio doesn't work, and if I kill the session itself, then all the objects created in that session are lost as well.

What is the best way to do it?

2 answers:

Answer 0 (score: 2)

Since R may be blocked waiting for a response from Spark, the most practical approach is probably to go to the Web UI (if you can access it) and kill the current stage.

Open the Web UI (default port 8080) and click SparkR, which is the application name. [screenshot: Spark Master WebUI] You are now in the SparkR application UI. Click the Stages tab, then press (kill) to kill the active stage. This of course will not kill everything at once: it kills only the active stage, and other stages may also need to be killed. [screenshot: Spark Application WebUI]
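If you can plan ahead before launching the work, a programmatic alternative to the UI is SparkR's job-group API. The sketch below is an illustration, not part of the answer: the group ID "long-job" is a made-up name, and it assumes a SparkR version where setJobGroup/cancelJobGroup/clearJobGroup take no context argument (in Spark 2.0.0 they take the Spark context as an extra first argument). Note also that the (kill) links in the UI only appear when spark.ui.killEnabled is true, which is the default.

```r
library(SparkR)
sparkR.session()  # connect; adjust master/appName for your cluster

# Tag all jobs launched from this point on with a group ID.
# interruptOnCancel = TRUE asks Spark to interrupt running tasks
# when the group is cancelled.
setJobGroup("long-job", "long-running SparkR job", interruptOnCancel = TRUE)

df <- as.DataFrame(faithful)
count(df)  # the long-running action would go here

# Abort everything tagged "long-job" (e.g. once control returns to R):
cancelJobGroup("long-job")
clearJobGroup()  # stop tagging subsequent jobs
```

Because the R front end blocks while an action runs, cancelJobGroup is mostly useful from a second control path; when R is already stuck, the Web UI kill described above remains the practical route.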

Answer 1 (score: 0)

You can simply run: sparkR.session.stop()

Then restart R from the RStudio menu: Session -> Restart R.
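Putting the two steps together, a minimal sketch of that reset, assuming Spark 2.0's sparkR.session API:

```r
library(SparkR)

# Stop the current Spark session; this tears down the SparkR application
# and, with it, any jobs it was running.
sparkR.session.stop()

# After Session -> Restart R in RStudio, reattach with a fresh session:
library(SparkR)
sparkR.session()
```

As the question itself notes, restarting R discards the objects created in that session, so this is more of a last resort than the stage-by-stage kill in the first answer.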