No module named unusual_prefix_*

Asked: 2016-08-30 07:34:01

Tags: python pickle airflow

I am trying to run the PythonOperator example on my Airflow installation. The installation has the webserver, scheduler and worker deployed on the same machine, and it runs all non-PythonOperator tasks without complaint. This task fails, complaining that it cannot import the module "unusual_prefix_*", where * is the name of the file containing the DAG.
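
A minimal sketch of the DAG file in question (adapted from the Airflow PythonOperator example; the dag_id and task_id match the log below, while the start date and schedule are assumptions):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python_operator import PythonOperator

    dag = DAG(
        dag_id='tutorialpy',                 # matches the dag_id seen in the log
        start_date=datetime(2016, 8, 23),    # assumed start date
        schedule_interval='@daily')          # assumed schedule

    def print_the_context(ds, **kwargs):
        # Print the execution date that Airflow passes in via the context.
        print(ds)
        return 'Whatever you return gets printed in the task logs'

    task = PythonOperator(
        task_id='print_the_context',
        python_callable=print_the_context,
        provide_context=True,                # pass the Airflow context to the callable
        dag=dag)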

Full stack trace:

['/usr/bin/airflow', 'run', 'tutorialpy', 'print_the_context', '2016-08-23T10:00:00', '--pickle', '90', '--local']
[2016-08-29 11:35:20,054] {__init__.py:36} INFO - Using executor CeleryExecutor
[2016-08-29 11:35:20,175] {configuration.py:508} WARNING - section/key [core/fernet_key] not found in config
Traceback (most recent call last):
  File "/usr/bin/airflow", line 90, in <module>
    args.func(args)
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/airflow/bin/cli.py", line 214, in run
    DagPickle).filter(DagPickle.id == args.pickle).first()
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2659, in first
    ret = list(self[0:1])
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2457, in __getitem__
    return list(res)
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/sqlalchemy/orm/loading.py", line 86, in instances
    util.raise_from_cause(err)
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/sqlalchemy/util/compat.py", line 202, in raise_from_cause
    reraise(type(exception), exception, tb=exc_tb, cause=cause)
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/sqlalchemy/orm/loading.py", line 71, in instances
    rows = [proc(row) for row in fetch]
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/sqlalchemy/orm/loading.py", line 428, in _instance
    loaded_instance, populate_existing, populators)
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/sqlalchemy/orm/loading.py", line 486, in _populate_full
    dict_[key] = getter(row)
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/sqlalchemy/sql/sqltypes.py", line 1253, in process
    return loads(value)
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/dill/dill.py", line 260, in loads
    return load(file)
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/dill/dill.py", line 250, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1090, in load_global
    klass = self.find_class(module, name)
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/dill/dill.py", line 406, in find_class
    return StockUnpickler.find_class(self, module, name)
  File "/usr/lib/python2.7/pickle.py", line 1124, in find_class
    __import__(module)
ImportError: No module named unusual_prefix_tutorialpy
[2016-08-29 11:35:20,498: ERROR/Worker-1] Command 'airflow run tutorialpy print_the_context 2016-08-23T10:00:00 --pickle 90 --local ' returned non-zero exit status 1
[2016-08-29 11:35:20,502: ERROR/MainProcess] Task airflow.executors.celery_executor.execute_command[01152b52-044e-4361-888c-ef2d45983c60] raised unexpected: AirflowException('Celery command failed',)
Traceback (most recent call last):
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/celery/app/trace.py", line 240, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/celery/app/trace.py", line 438, in __protected_call__
    return self.run(*args, **kwargs)
  File "/opt/mf-airflow/airflow/lib/python2.7/site-packages/airflow/executors/celery_executor.py", line 45, in execute_command
    raise AirflowException('Celery command failed')
AirflowException: Celery command failed

2 Answers:

Answer 0 (score: 2)

Using the donot_pickle=True option (see Documentation) I got rid of the error. Probably not the best solution, but good enough for my case.
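
For reference, a sketch of one way to set this, assuming Airflow 1.7.x, where donot_pickle lives in the [core] section of airflow.cfg:

    # airflow.cfg
    [core]
    # Disable pickling of DAGs into the metadata database;
    # workers then parse the DAG files themselves.
    donot_pickle = True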

Answer 1 (score: 0)

I also ran into this issue when running puckel/docker-airflow on Kubernetes.

Setting donot_pickle to True did not work for me.

After some experimentation, I got rid of the problem by removing the -p argument from the arguments used to start the airflow scheduler/worker processes:

airflow scheduler -n 5 -p  ->  airflow scheduler -n 5

airflow worker -p  ->  airflow worker