Databricks PySpark jobs keep getting cancelled

Asked: 2019-01-03 11:44:40

Tags: pyspark databricks azure-databricks

I'm working with Databricks notebooks on Azure, and I had a perfectly good PySpark notebook that ran fine all day yesterday. But at the end of the day I noticed I was getting a strange error on code I knew had worked before: org.apache.spark.SparkException: Job aborted due to stage failure: Task from application

Since it was late, I left it until today. Today I tried creating a new cluster to run the code, but this time it keeps telling me my jobs have been "Cancelled".

In fact, I tried running just a single line of code:

filePath = "/SalesData.csv"

and even that gets cancelled.

Edit:

This is the standard error log from Azure:

OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
/databricks/python/lib/python3.5/site-packages/IPython/config/loader.py:38: UserWarning: IPython.utils.traitlets has moved to a top-level traitlets package.
  from IPython.utils.traitlets import HasTraits, List, Any, TraitError
Fri Jan  4 16:51:08 2019 py4j imported
Fri Jan  4 16:51:08 2019 Python shell started with PID  2543  and guid  86405138b8744987a1df085e4454bb5d
Could not launch process The 'config' trait of an IPythonShell instance must be a Config, but a value of class 'IPython.config.loader.Config' (i.e. {'HistoryManager': {'hist_file': ':memory:'}, 'HistoryAccessor': {'hist_file': ':memory:'}}) was specified. Traceback (most recent call last):
  File "/tmp/1546620668035-0/PythonShell.py", line 1048, in <module>
    launch_process()
  File "/tmp/1546620668035-0/PythonShell.py", line 1036, in launch_process
    console_buffer, error_buffer)
  File "/tmp/1546620668035-0/PythonShell.py", line 508, in __init__
    self.shell = self.create_shell()
  File "/tmp/1546620668035-0/PythonShell.py", line 617, in create_shell
    ip_shell = IPythonShell.instance(config=config, user_ns=user_ns)
  File "/databricks/python/lib/python3.5/site-packages/traitlets/config/configurable.py", line 412, in instance
    inst = cls(*args, **kwargs)
  File "/databricks/python/lib/python3.5/site-packages/IPython/terminal/embed.py", line 159, in __init__
    super(InteractiveShellEmbed,self).__init__(**kw)
  File "/databricks/python/lib/python3.5/site-packages/IPython/terminal/interactiveshell.py", line 455, in __init__
    super(TerminalInteractiveShell, self).__init__(*args, **kwargs)
  File "/databricks/python/lib/python3.5/site-packages/IPython/core/interactiveshell.py", line 622, in __init__
    super(InteractiveShell, self).__init__(**kwargs)
  File "/databricks/python/lib/python3.5/site-packages/traitlets/config/configurable.py", line 84, in __init__
    self.config = config
  File "/databricks/python/lib/python3.5/site-packages/traitlets/traitlets.py", line 583, in __set__
    self.set(obj, value)
  File "/databricks/python/lib/python3.5/site-packages/traitlets/traitlets.py", line 557, in set
    new_value = self._validate(obj, value)
  File "/databricks/python/lib/python3.5/site-packages/traitlets/traitlets.py", line 589, in _validate
    value = self.validate(obj, value)
  File "/databricks/python/lib/python3.5/site-packages/traitlets/traitlets.py", line 1681, in validate
    self.error(obj, value)
  File "/databricks/python/lib/python3.5/site-packages/traitlets/traitlets.py", line 1528, in error
    raise TraitError(e)
traitlets.traitlets.TraitError: The 'config' trait of an IPythonShell instance must be a Config, but a value of class 'IPython.config.loader.Config' (i.e. {'HistoryManager': {'hist_file': ':memory:'}, 'HistoryAccessor': {'hist_file': ':memory:'}}) was specified.

4 Answers:

Answer 0 (score: 1)

My team and I ran into this problem after installing the azureml['notebooks'] Python package on our cluster. The installation appeared to work, but we got the "Cancelled" message whenever we tried to run a code cell.

We also got errors in the logs similar to the one in this post:

The 'config' trait of an IPythonShell instance must be a Config, 
  but a value of class 'IPython.config.loader.Config'...

It seems that some Python packages can conflict with, or be incompatible with, this Config object. We uninstalled the library, restarted the cluster, and everything worked. Hope this helps someone :)
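
If you are not sure which recently installed library is the one clashing, one way to spot the usual suspects, once the cluster responds to cells again, is to list the installed packages from a notebook cell. This is only a sketch; the package names it looks for (ipython, traitlets, azureml) are assumptions based on this thread:

# Sketch: list installed packages and flag ones that commonly clash with
# Databricks' bundled IPython/traitlets configuration (names assumed from this thread).
import subprocess

installed = subprocess.check_output(["pip", "list"], universal_newlines=True)
suspects = ("ipython", "traitlets", "azureml")
for line in installed.splitlines():
    if any(pkg in line.lower() for pkg in suspects):
        print(line)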

Answer 1 (score: 0)

Well, I ended up creating yet another new cluster, and now it seems to work. The only difference is that on the previous cluster I had set a maximum node count so it could scale up to at most 5; this time I left it at the default of 8.

But I don't know whether that was really what made the difference, especially since yesterday's error happened on a cluster that had been running fine before, and today's error happened on extremely simple code.

Answer 2 (score: 0)

It sounds like your cluster may have gotten into a bad state and needs to be restarted. Sometimes the underlying VM services also go bad, and you need to launch a new cluster with fresh nodes. Whenever you can't execute any code, always start by restarting the cluster.
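
If the notebook UI itself is too unresponsive for this, a restart can also be triggered through the Databricks Clusters REST API. The sketch below assumes authentication with a personal access token; the workspace URL, token, and cluster ID are placeholders:

# Sketch: restart a cluster via the Databricks Clusters API 2.0.
# <your-workspace>, <personal-access-token> and <cluster-id> are placeholders.
import requests

host = "https://<your-workspace>.azuredatabricks.net"
token = "<personal-access-token>"

resp = requests.post(
    host + "/api/2.0/clusters/restart",
    headers={"Authorization": "Bearer " + token},
    json={"cluster_id": "<cluster-id>"},
)
resp.raise_for_status()
print("Restart requested, status:", resp.status_code)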

Answer 3 (score: 0)

It seems there was a problem with the version of the IPython package that was installed. What fixed it for us was downgrading IPython:

Clusters (left pane) > click your cluster > Libraries > Install New > PyPI > in the "Package" field enter: ipython==3.2.3 > Install

Then restart the cluster.

In addition, there appears to be another, similar issue with Databricks' NumPy package that surfaced right after the IPython fix. If you run into that one as well, try downgrading to numpy==1.15.0 using the same method as for IPython.
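
Once the cluster has restarted, a quick sanity check from a notebook cell confirms that the pinned versions actually took effect (a minimal sketch; the expected version strings are simply the ones suggested above):

# Sketch: verify the downgraded versions are active after the cluster restart.
import IPython
import numpy

print("IPython:", IPython.__version__)  # expected 3.2.3 if the PyPI pin was applied
print("NumPy:", numpy.__version__)      # expected 1.15.0 if NumPy was also pinned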