I have an HDInsight Spark cluster. I installed TensorFlow using a script action, and the installation completed successfully.
But now, when I create a Jupyter notebook and run the following, I get this error:
import tensorflow
Starting Spark application
The code failed because of a fatal error:
Session 8 unexpectedly reached final status 'dead'. See logs:
YARN Diagnostics:
Application killed by user..
Some things to try:
a) Make sure Spark has enough available resources for Jupyter to create a Spark context. For instructions on how to assign resources see http://go.microsoft.com/fwlink/?LinkId=717038
b) Contact your cluster administrator to make sure the Spark magics library is configured correctly.
I don't know how to resolve this error. I tried a few things, like looking at the logs, but they didn't help.
I just want to connect to my data and train a model with TensorFlow.
Answer 0 (score: 1)
This looks like a Spark application resource error. Check the available resources on your cluster and shut down any applications you don't need. See more details here: https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-resource-manager#kill-running-applications
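The linked page walks through killing applications from the YARN UI; the same cleanup can be done from an SSH session on the cluster head node with the YARN CLI. A minimal sketch (the application ID below is a placeholder, not one from your cluster):

```shell
# List running YARN applications to see what is holding cluster resources
yarn application -list -appStates RUNNING

# Kill an application you no longer need so the Jupyter/Livy session
# can acquire a Spark context. Substitute an ID from the list output;
# this one is a placeholder.
yarn application -kill application_1234567890123_0001
```

After freeing resources, restarting the notebook kernel should let the Spark session start cleanly.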