Spark job uses old application resources and jar

Asked: 2016-09-10 16:13:10

Tags: java scala apache-spark hortonworks-data-platform

I am new to Spark. I am trying to run a Spark job in client mode, and it works fine as long as I keep using the same paths for the jar and the other resource files. After killing the running application with the YARN command and resubmitting the Spark job with updated jar and file locations, the job still uses my old paths. Only after restarting the system does the Spark job pick up the new paths. The spark-submit command:

spark-submit \
    --class export.streaming.DataExportStreaming \
    --jars /usr/hdp/current/spark-client/lib/postgresql-9.4.1209.jar \
    --driver-class-path /usr/hdp/current/spark-client/lib/postgresql-9.4.1209.jar  \
    --conf spark.driver.extraClassPath=/usr/hdp/current/spark-client/lib/postgresql-9.4.1209.jar \
    --conf spark.executor.extraClassPath=/usr/hdp/current/spark-client/lib/postgresql-9.4.1209.jar \
    --master yarn --deploy-mode client \
    --files /usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_selfservice_session_daily.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_selfservice_session_device.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_selfservice_session_workflow.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_selfservice_session_workflow_step.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_session_assignment.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_session_daily.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_session_device.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_session_queue.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_session_workflow.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_session_workflow_step.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_user_login_session.sql \
    /usr/lib/firebet-spark/52.0.2-1/data-export/lib/data-export-assembly-52.0.2-1.jar \
    /usr/lib/firebet-spark/52.0.2-1/data-export/resources/application.conf
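For context, the kill-and-resubmit cycle described above would normally look something like the following. This is only a sketch of the expected workflow, not a diagnosis of the problem; the application id shown is a hypothetical placeholder, and the final command simply verifies that the paths being passed to spark-submit really point at the updated artifacts before resubmitting.

```shell
# List running YARN applications and note the id of the stale Spark job
yarn application -list -appStates RUNNING

# Kill it (application_1473500000000_0001 is a hypothetical placeholder id)
yarn application -kill application_1473500000000_0001

# Before resubmitting, confirm the jar at the new location is really the
# updated build (timestamp/size), so a stale path on disk can be ruled out
ls -l /usr/lib/firebet-spark/52.0.2-1/data-export/lib/data-export-assembly-52.0.2-1.jar
```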

How can I fix this problem? Is the spark-submit command correct? Which deploy mode is better for production, client or cluster?
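For comparison, a cluster-mode submission of the same job might look like the sketch below. This is an assumption-laden illustration, not a tested command: in cluster mode the driver runs inside a YARN container, so anything the driver reads (the resource files, application.conf) must either exist at the same path on every node or be shipped with `--files`, in which case it appears in the container's working directory under its bare filename.

```shell
# Hedged sketch of a cluster-mode submit. Only application.conf is shown in
# --files for brevity; the .sql resources would be listed the same way.
spark-submit \
    --class export.streaming.DataExportStreaming \
    --master yarn --deploy-mode cluster \
    --jars /usr/hdp/current/spark-client/lib/postgresql-9.4.1209.jar \
    --files /usr/lib/firebet-spark/52.0.2-1/data-export/resources/application.conf \
    /usr/lib/firebet-spark/52.0.2-1/data-export/lib/data-export-assembly-52.0.2-1.jar \
    application.conf
```

Note that the last argument is the bare filename `application.conf` rather than an absolute path, because in cluster mode files distributed via `--files` are localized into the driver container's working directory.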

0 Answers:

No answers