Airflow ModuleNotFoundError: No module named 'pyspark'

Asked: 2019-09-20 08:43:03

Tags: apache-spark pyspark airflow

I have Airflow installed on my machine and it works fine, and I also have a local Spark installation (which also runs fine). I want to use Airflow to orchestrate two Spark tasks: task_spark_datatransform >> task_spark_model_reco. The two pyspark modules associated with these tasks have already been tested and run correctly under Spark.

I also created a very simple Airflow DAG using BashOperator* to run each Spark task. For example, for the task task_spark_datatransform I have:

task_spark_datatransform = BashOperator(
    task_id='task_spark_datatransform',
    bash_command=spark_home + 'spark-submit --master local[*] ' + srcDir + 'dataprep.py')
where, in my case, spark_home = '/usr/bin/spark/bin/'

*As described in several serious tutorials on this topic.
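
For reference, the full DAG is a variation of the following minimal sketch (the dag_id, start_date, schedule, the srcDir value, and the second script name model_reco.py are illustrative placeholders; only the BashOperator pattern above comes from my actual file):

# spark_af.py -- minimal sketch of the DAG (Airflow 1.10-style imports)
from datetime import datetime
from airflow import DAG
from airflow.operators.bash_operator import BashOperator

spark_home = '/usr/bin/spark/bin/'
srcDir = '/home/ach/airflow/dags/'  # assumption: scripts sit next to the DAG file

dag = DAG('spark_af', start_date=datetime(2019, 9, 1), schedule_interval=None)

task_spark_datatransform = BashOperator(
    task_id='task_spark_datatransform',
    bash_command=spark_home + 'spark-submit --master local[*] ' + srcDir + 'dataprep.py',
    dag=dag)

task_spark_model_reco = BashOperator(
    task_id='task_spark_model_reco',
    bash_command=spark_home + 'spark-submit --master local[*] ' + srcDir + 'model_reco.py',
    dag=dag)

# run the data transform before the recommendation model
task_spark_datatransform >> task_spark_model_reco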

Question: why doesn't Airflow recognize pyspark?

Logs:

[2019-09-20 10:21:21 +0200] [5945] [INFO] Worker exiting (pid: 5945)
[2019-09-20 10:21:51 +0200] [5554] [INFO] Handling signal: ttin
[2019-09-20 10:21:51 +0200] [6128] [INFO] Booting worker with pid: 6128
[2019-09-20 10:21:51,609] {__init__.py:51} INFO - Using executor SequentialExecutor
[2019-09-20 10:21:52,021] {__init__.py:305} INFO - Filling up the DagBag from /home/ach/airflow/dags
[2019-09-20 10:21:52,026] {__init__.py:416} ERROR - Failed to import: /home/ach/airflow/dags/spark_af.py
Traceback (most recent call last):
  File "/home/ach/airflow/lib/python3.7/site-packages/airflow/models/__init__.py", line 413, in process_file
    m = imp.load_source(mod_name, filepath)
  File "/home/ach/airflow/lib/python3.7/imp.py", line 171, in load_source
    module = _load(spec)
  File "<frozen importlib._bootstrap>", line 696, in _load
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/ach/airflow/dags/spark_af.py", line 3, in <module>
    import dataprep
  File "/home/ach/airflow/dags/dataprep.py", line 2, in <module>
    from pyspark.sql import SparkSession
ModuleNotFoundError: No module named 'pyspark'

1 answer:

Answer 0 (score: 1)

It looks like you are missing pyspark.

Run the following command:

pip install pyspark
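
Note where the error comes from: according to the traceback, it is raised while Airflow parses /home/ach/airflow/dags/spark_af.py, because the DAG file does "import dataprep", which in turn imports pyspark at parse time. So pyspark has to be importable in the Python environment the Airflow scheduler runs in, not only in the one spark-submit uses. The paths in the traceback (/home/ach/airflow/lib/python3.7/site-packages) suggest Airflow lives in a virtualenv at /home/ach/airflow; assuming that is the case, install pyspark into that same environment:

# assumption: Airflow runs from the virtualenv at /home/ach/airflow
source /home/ach/airflow/bin/activate
pip install pyspark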