PYSPARK_GATEWAY_SECRET error when trying to use pyspark with Livy

Date: 2018-09-12 14:00:22

Tags: pyspark livy

Starting pyspark on the command line works as expected. Going through Livy, however, does not.

I set up the connection using Postman. First, I POST this to the sessions endpoint:

{
  "kind": "pyspark",
  "proxyUser": "spark"
}
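The same session-creation request can also be sent from a script instead of Postman. A minimal sketch using only the standard library, assuming Livy is listening on its default port 8998 on localhost (adjust the URL for your cluster):

```python
import json
from urllib import request

LIVY_URL = "http://localhost:8998"  # assumption: default Livy host/port

# The exact payload POSTed to the sessions endpoint above.
payload = {"kind": "pyspark", "proxyUser": "spark"}

def create_session(url=LIVY_URL):
    """POST /sessions and return the parsed JSON response."""
    req = request.Request(
        url + "/sessions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```

The response JSON contains the session `id`, which is what the `sessions/XYZ/log` path below refers to.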

The session starts, and I can see the Spark application coming up on YARN. However, this error appears in my container log:

18/09/12 15:53:00 ERROR repl.PythonInterpreter: Process has died with 1
18/09/12 15:53:00 ERROR repl.PythonInterpreter: Traceback (most recent call last):
  File "/yarn/nm/usercache/livy/appcache/application_1535188013308_0051/container_1535188013308_0051_01_000001/tmp/3015653701235928503", line 643, in <module>
    sys.exit(main())
  File "/yarn/nm/usercache/livy/appcache/application_1535188013308_0051/container_1535188013308_0051_01_000001/tmp/3015653701235928503", line 533, in main
    exec('from pyspark.shell import sc', global_dict)
  File "<string>", line 1, in <module>
  File "/opt/cloudera/parcels/SPARK2-2.3.0.cloudera3-1.cdh5.13.3.p0.458809/lib/spark2/python/lib/pyspark.zip/pyspark/shell.py", line 38, in <module>
  File "/opt/cloudera/parcels/SPARK2-2.3.0.cloudera3-1.cdh5.13.3.p0.458809/lib/spark2/python/lib/pyspark.zip/pyspark/context.py", line 292, in _ensure_initialized
  File "/opt/cloudera/parcels/SPARK2-2.3.0.cloudera3-1.cdh5.13.3.p0.458809/lib/spark2/python/lib/pyspark.zip/pyspark/java_gateway.py", line 47, in launch_gateway
  File "/usr/lib64/python2.7/UserDict.py", line 23, in __getitem__
    raise KeyError(key)
KeyError: 'PYSPARK_GATEWAY_SECRET'
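The traceback ends in a plain dictionary lookup: pyspark's `launch_gateway` expects the process that spawned the Python interpreter (here, Livy) to pass an authentication secret through the `PYSPARK_GATEWAY_SECRET` environment variable, and fails with `KeyError` when it is absent. A minimal reproduction of that failure mode (the variable name is from the traceback; the helper function is hypothetical):

```python
import os

def read_gateway_secret(env=os.environ):
    """Mimic pyspark's env lookup: raises KeyError if the launcher
    did not set PYSPARK_GATEWAY_SECRET in the child's environment."""
    return env["PYSPARK_GATEWAY_SECRET"]

# Simulate a launcher (like Livy 0.4) that does not set the secret:
try:
    read_gateway_secret(env={})
except KeyError as missing_key:
    print("missing:", missing_key)
```

So the error is not about the secret's value being wrong; the launcher simply never put it in the environment.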

The output of sessions/XYZ/log is:

{
    "id": 16,
    "from": 0,
    "total": 46,
    "log": [
        "stdout: ",
        "\nstderr: ",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/rsc/livy-api-0.4.0-SNAPSHOT.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/rsc/livy-rsc-0.4.0-SNAPSHOT.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/rsc/netty-all-4.0.29.Final.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/repl/commons-codec-1.9.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/repl/livy-core_2.11-0.4.0-SNAPSHOT.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/repl/livy-repl_2.11-0.4.0-SNAPSHOT.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/lama/lama.main-assembly-0.9.0-spark2.3.0-hadoop2.6.5-SNAPSHOT.jar.",
        "18/09/12 15:52:50 INFO client.RMProxy: Connecting to ResourceManager at master1.lama.nuc/192.168.42.100:8032",
        "18/09/12 15:52:51 INFO yarn.Client: Requesting a new application from cluster with 6 NodeManagers",
        "18/09/12 15:52:51 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (12288 MB per container)",
        "18/09/12 15:52:51 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead",
        "18/09/12 15:52:51 INFO yarn.Client: Setting up container launch context for our AM",
        "18/09/12 15:52:51 INFO yarn.Client: Setting up the launch environment for our AM container",
        "18/09/12 15:52:51 INFO yarn.Client: Preparing resources for our AM container",
        "18/09/12 15:52:51 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/rsc/livy-api-0.4.0-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/rsc/livy-rsc-0.4.0-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/rsc/netty-all-4.0.29.Final.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/repl/commons-codec-1.9.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/repl/livy-core_2.11-0.4.0-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/repl/livy-repl_2.11-0.4.0-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/lama/lama.main-assembly-0.9.0-spark2.3.0-hadoop2.6.5-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Uploading resource file:/tmp/spark-37413ebc-9427-44d8-8a01-c4222eb899f8/__spark_conf__7516701035111969209.zip -> hdfs://master1.lama.nuc:8020/user/livy/.sparkStaging/application_1535188013308_0051/__spark_conf__.zip",
        "18/09/12 15:52:53 INFO spark.SecurityManager: Changing view acls to: livy",
        "18/09/12 15:52:53 INFO spark.SecurityManager: Changing modify acls to: livy",
        "18/09/12 15:52:53 INFO spark.SecurityManager: Changing view acls groups to: ",
        "18/09/12 15:52:53 INFO spark.SecurityManager: Changing modify acls groups to: ",
        "18/09/12 15:52:53 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(livy); groups with view permissions: Set(); users  with modify permissions: Set(livy); groups with modify permissions: Set()",
        "18/09/12 15:52:57 INFO yarn.Client: Submitting application application_1535188013308_0051 to ResourceManager",
        "18/09/12 15:52:57 INFO impl.YarnClientImpl: Submitted application application_1535188013308_0051",
        "18/09/12 15:52:57 INFO yarn.Client: Application report for application_1535188013308_0051 (state: ACCEPTED)",
        "18/09/12 15:52:57 INFO yarn.Client: ",
        "\t client token: N/A",
        "\t diagnostics: N/A",
        "\t ApplicationMaster host: N/A",
        "\t ApplicationMaster RPC port: -1",
        "\t queue: root.users.livy",
        "\t start time: 1536760377659",
        "\t final status: UNDEFINED",
        "\t tracking URL: http://master1.lama.nuc:8088/proxy/application_1535188013308_0051/",
        "\t user: livy",
        "18/09/12 15:52:57 INFO util.ShutdownHookManager: Shutdown hook called",
        "18/09/12 15:52:57 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-795d9b05-a5ad-4930-ad8b-77034022bc17",
        "18/09/12 15:52:57 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-37413ebc-9427-44d8-8a01-c4222eb899f8",
        "\nYARN Diagnostics: "
    ]
}

What is going wrong here? I am using CDH 5.15.0 with parcels and Spark2. Scala sessions work fine.

Follow-up

I changed the deploy mode from cluster to client. The KeyError is gone, but when I try to run a simple sc.version, I get Interpreter died without any traceback or error.
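For reference, the sc.version check is issued as a statement against the running session. A minimal sketch of that request, assuming the session id 16 seen in the log above and the default Livy port:

```python
import json
from urllib import request

LIVY_URL = "http://localhost:8998"  # assumption: default Livy host/port
SESSION_ID = 16                     # session id from the log output above

# The code to run inside the session's interpreter.
statement = {"code": "sc.version"}

def submit_statement(session_id=SESSION_ID, url=LIVY_URL):
    """POST /sessions/{id}/statements and return the parsed response."""
    req = request.Request(
        "%s/sessions/%d/statements" % (url, session_id),
        data=json.dumps(statement).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```

If the interpreter has died, this request is what surfaces the failure on the client side.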

1 answer:

Answer 0 (score: 0):

I ran into the same problem and solved it by upgrading to Livy 0.5.0.

Apparently, CDH 5.15.0 fixed a security vulnerability (CVE-2018-1334), and that fix introduced an incompatibility with Livy < 0.5.0. Thanks to Marcelo Vanzin for posting this in the livy-user mailing list archives.