Error running PySpark: cannot connect to the master

Asked: 2017-03-26 19:30:21

Tags: python-2.7 apache-spark pyspark

Hi, I have the following Python code:

from __future__ import print_function

import sys

from pyspark.sql import SparkSession

from data import Data

if __name__ == "__main__":
    if len(sys.argv) != 2:
        print("Usage: runner <number_of_executors>", file=sys.stderr)
        exit(-1)

    spark = SparkSession \
        .builder \
        .master("spark://138.xxx.xxx.xxx:7077") \
        .config("spark.num-executors", sys.argv[1]) \
        .config("spark.driver.memory", "1g") \
        .config("spark.executor.memory", "1g") \
        .config("spark.executor.cores", "4") \
        .appName("APP") \
        .getOrCreate()

    data = Data(spark)

    spark.stop()

The Data class loads various CSV files, but that is not important here.

I added the following lines to ~/.bash_profile:

export SPARK_HOME=/home/tsar/spark
export PATH=$SPARK_HOME/bin:$SPARK_HOME/sbin:$SPARK_HOME/conf:$PATH
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/build
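
As a sanity check, it is worth confirming which pyspark a plain python interpreter actually picks up through this PYTHONPATH. A minimal sketch (the expected location is an assumption based on the exports above):

    from __future__ import print_function

    # Confirm the interpreter imports pyspark from the cluster's Spark
    # installation rather than some other copy on the path.
    import pyspark

    print(pyspark.__file__)  # expected somewhere under /home/tsar/spark/python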

I also have the following config files:

  • slaves, with the list of worker nodes
  • spark-defaults.conf

    spark.master                       spark://138.xxx.xxx.xxx:7077
    spark.driver.memory                1g
    spark.executor.memory              1g
    spark.executor.cores               4
    
  • spark-env.sh

    export SPARK_MASTER_HOST=138.xxx.xxx.xxx
    export SPARK_MASTER_MEMORY=5g
    export SPARK_WORKER_MEMORY=1024m
    

Here is what happens:

  • pyspark --master 138.xxx.xxx.xxx:7077
    • starts pyspark and connects it to the master
  • spark-submit --num-executors 17 main.py 4
    • ignores the configuration in the Python code (which is unexpected), takes its parameters from the conf files unless they are overridden by command-line options, connects to the master, and runs the code
  • python main.py 3
    • the option I want to use; it fails to connect to the master with the stack trace below

    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    17/03/26 19:58:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    17/03/26 19:59:12 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
    17/03/26 19:59:12 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
    17/03/26 19:59:12 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
    17/03/26 19:59:12 ERROR TransportResponseHandler: Still have 3 requests outstanding when connection from /xxx.xxx.xxx.xxx:7077 is closed
    17/03/26 19:59:12 ERROR SparkContext: Error initializing SparkContext.
    java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
        at scala.Predef$.require(Predef.scala:224)
        at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:236)
        at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Thread.java:745)
    Traceback (most recent call last):
      File "main.py", line 21, in <module>
        .appName("CS5052-01 Processing") \
      File "/cs/home/bbt/spark/python/pyspark/sql/session.py", line 169, in getOrCreate
        sc = SparkContext.getOrCreate(sparkConf)
      File "/cs/home/bbt/spark/python/pyspark/context.py", line 307, in getOrCreate
        SparkContext(conf=conf or SparkConf())
      File "/cs/home/bbt/spark/python/pyspark/context.py", line 118, in __init__
        conf, jsc, profiler_cls)
      File "/cs/home/bbt/spark/python/pyspark/context.py", line 179, in _do_init
        self._jsc = jsc or self._initialize_context(self._conf._jconf)
      File "/cs/home/bbt/spark/python/pyspark/context.py", line 246, in _initialize_context
        return self._jvm.JavaSparkContext(jconf)
      File "/cs/home/bbt/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1401, in __call__
      File "/cs/home/bbt/spark/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
    py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
    : java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
        at scala.Predef$.require(Predef.scala:224)
        at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:236)
        at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Thread.java:745)

How is the last way of running it different from the others? The Spark master is definitely up at that IP address and reachable.
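
One difference worth noting: when main.py is launched with plain python, pyspark starts its own JVM gateway and reads its launch flags from the PYSPARK_SUBMIT_ARGS environment variable instead of getting them from spark-submit. A hedged sketch of pinning the master that way (untested here; assumes Spark 2.x):

    import os

    # Sketch: the value must end with "pyspark-shell", and it must be set
    # before pyspark is imported, because pyspark reads it when it starts
    # the JVM gateway.
    os.environ["PYSPARK_SUBMIT_ARGS"] = (
        "--master spark://138.xxx.xxx.xxx:7077 pyspark-shell"
    )

    from pyspark.sql import SparkSession  # import only after setting the flags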

2 Answers:

Answer 0 (score: 2)

I had a similar problem where the master and slaves could definitely communicate, but when I went to submit a job I got the same vague error:

java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem

The usual explanation for this is that the master and slaves cannot communicate, but in my case the user I was running Spark as did not have permission to write the log files on the slave nodes. I found this by going to the Spark master web UI (master:8080 by default) and drilling down from the failed application's link to the worker stderr output on the slave.
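
If you want to test the permissions theory directly on a slave node, a minimal sketch (both paths are assumptions; point them at your actual log and work directories):

    from __future__ import print_function

    import os

    # Check that the user running Spark can write where the worker needs to.
    for d in ("/opt/spark/logs", "/opt/spark/work"):
        print(d, "exists:", os.path.isdir(d), "writable:", os.access(d, os.W_OK))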

Is there any more detail about the error in stderr or in the Spark logs (/opt/spark/logs/spark-xxx by default)?

Answer 1 (score: 0)

I ran into a similar situation. In my case it was a Spark version mismatch: my Spark Streaming job was running on 2.2.1 while my Spark master was on 3.0.1. Bringing both to the same version solved the problem for me. See this question: Running a python Apache Beam Pipeline on Spark
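
A quick way to check for such a mismatch, as a sketch: the driver-side version comes from the pyspark package, while the master's version is shown at the top of its web UI (port 8080 by default).

    from __future__ import print_function

    # Driver-side version; compare it by eye against what the master's
    # web UI reports. A protocol mismatch between the two can surface as
    # "All masters are unresponsive".
    import pyspark

    print("driver-side pyspark version:", pyspark.__version__)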