When I submit a workflow containing a Spark action through Hue, I get an import error.
The traceback is:
2019-01-30 16:31:48,048 [main] INFO org.apache.spark.deploy.yarn.ApplicationMaster - Waiting for spark context initialization...
Traceback (most recent call last):
File "mover.py", line 7, in <module>
import happybase
ImportError: No module named happybase
2019-01-30 16:31:48,169 [Driver] ERROR org.apache.spark.deploy.yarn.ApplicationMaster - User application exited with status 1
On my cluster I have a Python virtualenv with all the dependencies installed, and the cluster was configured following Cloudera's Spark instructions here: https://www.cloudera.com/documentation/enterprise/latest/topics/spark_python.html
When I run the application with the spark-submit command from the console, it works without any problems. The issue only appears when I use Hue.
After some research I found this article, http://www.learn4master.com/big-data/pyspark/run-pyspark-on-oozie, and tried the same approach without success.
The workflow code generated by Hue is:
<workflow-app name="Copy by hour" xmlns="uri:oozie:workflow:0.5">
    <start to="spark-c88a"/>
    <kill name="Kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="spark-c88a" retry-max="1" retry-interval="1">
        <spark xmlns="uri:oozie:spark-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>spark.executorEnv.PYSPARK_PYTHON</name>
                    <value>/opt/env_cluster/bin/python2</value>
                </property>
                <property>
                    <name>spark.yarn.appMasterEnv.PYSPARK_PYTHON</name>
                    <value>/opt/env_cluster/bin/python2</value>
                </property>
            </configuration>
            <master>yarn</master>
            <mode>cluster</mode>
            <name>landing_to_daily</name>
            <jar>mover.py</jar>
            <arg>1</arg>
            <arg>-s</arg>
            <arg>eir_landing</arg>
            <arg>-d</arg>
            <arg>eir_daily</arg>
            <file>/user/spark/eir/apps/mover.py#mover.py</file>
        </spark>
        <ok to="End"/>
        <error to="email-77d4"/>
    </action>
    <action name="email-77d4">
        <email xmlns="uri:oozie:email-action:0.2">
            <to>prueba@mail.com</to>
            <subject>Error | Copy by hour</subject>
            <body>Error in Workflow landing to daily</body>
            <content_type>text/plain</content_type>
        </email>
        <ok to="Kill"/>
        <error to="Kill"/>
    </action>
    <end name="End"/>
</workflow-app>
Answer 0 (score: 0)
With the help of Cloudera support, I solved the problem as follows:
1. Add the following to the Spark options:
--conf spark.yarn.appMasterEnv.PYSPARK_DRIVER_PYTHON=path_to_venv --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=path_to_venv
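In the workflow XML that Hue generates, options entered in the Spark action's options field end up in a `<spark-opts>` element inside the `<spark>` action. A minimal sketch of what that element might look like, assuming `path_to_venv` is the virtualenv's Python interpreter (e.g. the `/opt/env_cluster/bin/python2` from the question):

```xml
<!-- Sketch only: options from Hue's "Options list" field land in <spark-opts>.
     path_to_venv is a placeholder for the virtualenv interpreter path. -->
<spark-opts>--conf spark.yarn.appMasterEnv.PYSPARK_DRIVER_PYTHON=path_to_venv --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=path_to_venv</spark-opts>
```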
2. The Spark launcher also needs this environment variable, so set it as a job property:
<property>
<name>oozie.launcher.mapred.child.env</name>
<value>PYSPARK_PYTHON=path_to_venv</value>
</property>
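Putting both fixes together, the Spark action from the original workflow might look like the sketch below. This is an illustration only, substituting the question's `/opt/env_cluster/bin/python2` for `path_to_venv`; adjust the interpreter path to your own virtualenv.

```xml
<action name="spark-c88a" retry-max="1" retry-interval="1">
    <spark xmlns="uri:oozie:spark-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <!-- Have the Oozie launcher export PYSPARK_PYTHON to its child process -->
            <property>
                <name>oozie.launcher.mapred.child.env</name>
                <value>PYSPARK_PYTHON=/opt/env_cluster/bin/python2</value>
            </property>
        </configuration>
        <master>yarn</master>
        <mode>cluster</mode>
        <name>landing_to_daily</name>
        <jar>mover.py</jar>
        <!-- Point the YARN application master (driver) at the virtualenv interpreter -->
        <spark-opts>--conf spark.yarn.appMasterEnv.PYSPARK_DRIVER_PYTHON=/opt/env_cluster/bin/python2 --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=/opt/env_cluster/bin/python2</spark-opts>
        <arg>1</arg>
        <arg>-s</arg>
        <arg>eir_landing</arg>
        <arg>-d</arg>
        <arg>eir_daily</arg>
        <file>/user/spark/eir/apps/mover.py#mover.py</file>
    </spark>
    <ok to="End"/>
    <error to="email-77d4"/>
</action>
```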