Livy always runs in local mode

Date: 2018-01-02 13:40:08

Tags: apache-spark cloudera livy

I am trying to run a PySpark (or Spark) job through the Livy server with "spark.master = yarn".

What I have done:

1) In spark-defaults.conf:

spark.master yarn
spark.submit.deployMode client

2) In livy.conf:

livy.spark.master = yarn
livy.spark.deployMode = client

3) I send the request via curl with "conf": {"spark.master": "yarn"}

Example:

  curl -X POST -H "Content-Type: application/json"  localhost:8998/batches --data '{"file": "hdfs:///user/grzegorz/hello-world.py", "name": "MY", "conf": {"spark.master": "yarn"} }'
{"id":3,"state":"running","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":["stdout: ","\nstderr: "]}
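The same POST can be issued from Python instead of curl; a minimal sketch using only the standard library, assuming the Livy endpoint at localhost:8998 from the question (the helper names here are my own, not part of any Livy client API):

```python
import json
from urllib import request

LIVY_URL = "http://localhost:8998"  # assumed Livy endpoint from the question


def build_batch_payload(file_path, name):
    # Entries under "conf" are forwarded by Livy to spark-submit
    # as --conf key=value pairs.
    return {
        "file": file_path,
        "name": name,
        "conf": {"spark.master": "yarn"},
    }


def submit_batch(payload):
    # POST the JSON body to Livy's /batches endpoint and decode the reply.
    req = request.Request(
        LIVY_URL + "/batches",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    payload = build_batch_payload("hdfs:///user/grzegorz/hello-world.py", "MY")
    print(submit_batch(payload))
```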

What I always get in the logs:

18/01/02 14:45:07.880 qtp1758624236-28 INFO BatchSession$: Creating batch session 3: [owner: null, request: [proxyUser: None, file: hdfs:///user/grzegorz/hello-world.py, name: MY, conf: spark.master -> yarn]]

18/01/02 14:45:07.883 qtp1758624236-28 INFO SparkProcessBuilder: Running '/usr/local/share/spark/spark-2.0.2/bin/spark-submit' '--name' 'MY' '--conf' 'spark.master=local' 'hdfs:///user/grzegorz/hello-world.py'

I hope someone has an idea of how to solve this. Thanks in advance.

0 Answers:

There are no answers yet