spark-submit "Service 'Driver' could not bind on port" error

Date: 2016-07-18 05:45:25

Tags: apache-spark, word-count

I use the following command to run the Spark Java word count example:

time spark-submit --deploy-mode cluster --master spark://192.168.0.7:6066 --class org.apache.spark.examples.JavaWordCount /home/pi/Desktop/example/new/target/javaword.jar /books_50.txt 

When I run it, the output is as follows:

Running Spark using the REST application submission protocol.
16/07/18 03:55:41 INFO rest.RestSubmissionClient: Submitting a request to launch an application in spark://192.168.0.7:6066.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Submission successfully created as driver-20160718035543-0000. Polling submission state...
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Submitting a request for the status of submission driver-20160718035543-0000 in spark://192.168.0.7:6066.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: State of driver driver-20160718035543-0000 is now RUNNING.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Driver is running on worker worker-20160718041005-192.168.0.12-42405 at 192.168.0.12:42405.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Server responded with CreateSubmissionResponse:
{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20160718035543-0000",
  "serverSparkVersion" : "1.6.2",
  "submissionId" : "driver-20160718035543-0000",
  "success" : true
}

I checked the log of that particular worker (192.168.0.12), and it says:

Launch Command: "/usr/lib/jvm/jdk-8-oracle-arm32-vfp-hflt/jre/bin/java" "-cp" "/opt/spark/conf/:/opt/spark/lib/spark-assembly-1.6.2-hadoop2.6.0.jar:/opt/spark/lib/datanucleus-api-jdo-3.2.6.jar:/opt/spark/lib/datanucleus-core-3.2.10.jar:/opt/spark/lib/datanucleus-rdbms-3.2.9.jar" "-Xms1024M" "-Xmx1024M" "-Dspark.driver.supervise=false" "-Dspark.app.name=org.apache.spark.examples.JavaWordCount" "-Dspark.submit.deployMode=cluster" "-Dspark.jars=file:/home/pi/Desktop/example/new/target/javaword.jar" "-Dspark.master=spark://192.168.0.7:7077" "-Dspark.executor.memory=10M" "org.apache.spark.deploy.worker.DriverWrapper" "spark://Worker@192.168.0.12:42405" "/opt/spark/work/driver-20160718035543-0000/javaword.jar" "org.apache.spark.examples.JavaWordCount" "/books_50.txt"
========================================

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/07/18 04:10:58 INFO SecurityManager: Changing view acls to: pi
16/07/18 04:10:58 INFO SecurityManager: Changing modify acls to: pi
16/07/18 04:10:58 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(pi); users with modify permissions: Set(pi)
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'Driver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'Driver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)

My spark-env.sh file (for the master) contains:

export SPARK_MASTER_WEBUI_PORT="8080"
export SPARK_MASTER_IP="192.168.0.7"
export SPARK_EXECUTOR_MEMORY="10M"

My spark-env.sh file (for the worker) contains:

export SPARK_WORKER_WEBUI_PORT="8080"
export SPARK_MASTER_IP="192.168.0.7"
export SPARK_EXECUTOR_MEMORY="10M"

Please help!

10 Answers:

Answer 0 (score: 12)

I had the same problem when trying to run the shell, and was able to get it working by setting the SPARK_LOCAL_IP environment variable. You can assign it on the command line when running the shell:

SPARK_LOCAL_IP=127.0.0.1 ./bin/spark-shell

For a more permanent solution, create a spark-env.sh file in the conf directory of your Spark root and add the following line:

SPARK_LOCAL_IP=127.0.0.1

Make the script executable with chmod +x ./conf/spark-env.sh, and this environment variable will then be set by default.
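If you are starting from a fresh Spark distribution, a minimal sketch of that setup (assuming the standard conf/spark-env.sh.template ships with your build and that you run this from the Spark root) is:

cp conf/spark-env.sh.template conf/spark-env.sh        # start from the shipped template
echo 'SPARK_LOCAL_IP=127.0.0.1' >> conf/spark-env.sh   # bind Spark services to localhost
chmod +x conf/spark-env.sh                             # make the script executable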

Answer 1 (score: 7)

I am using Maven/SBT to manage dependencies, and Spark core is included in the jar file.

You can override SPARK_LOCAL_IP at runtime by setting "spark.driver.bindAddress" (here in Scala):

val config = new SparkConf()
config.setMaster("local[*]")
config.setAppName("Test App")
config.set("spark.driver.bindAddress", "127.0.0.1")
val sc = new SparkContext(config)

Answer 2 (score: 2)

I also had this problem.

The reason (for me) was that the IP of my local system was not reachable from my local system. I know that statement makes no sense, but please read on.

My system name (uname -n) showed that my system was named "sparkmaster". In my /etc/hosts file I had assigned the sparkmaster system a fixed IP address of "192.168.1.70". There were additional fixed IP addresses for sparknode01 and sparknode02 at ...1.71 and ...1.72 respectively.

Due to some other problems I had, I needed to change all of my network adapters to DHCP. This meant they were getting addresses such as 192.168.90.123. The DHCP addresses were not on the same network as the ...1.70 range, and there was no route configured between them.

When Spark starts, it seems to try to connect to the host named after uname (i.e. sparkmaster in my case). That was the IP 192.168.1.70, but there was no way to connect to it because that address was on an unreachable network.

My solution was to change one of my Ethernet adapters back to a fixed static address (i.e. 192.168.1.70), which resolved the problem.

So the issue seems to be that when Spark starts in "local mode", it tries to connect to a system named after your system name (rather than to localhost). I guess this makes sense if you want to set up a cluster (as I did), but it can result in the confusing message above. Possibly putting your system's hostname on the 127.0.0.1 entry in /etc/hosts would also solve this problem, but I have not tried it.
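For reference, a sketch of the /etc/hosts entries this answer describes (using its example hostnames and addresses, which are assumptions about your own setup) might look like:

# /etc/hosts - each hostname must resolve to an address that is actually reachable
192.168.1.70 sparkmaster
192.168.1.71 sparknode01
192.168.1.72 sparknode02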

Answer 3 (score: 1)

You need to add an entry for your hostname in the /etc/hosts file. Something like the following, where "hostname" is your machine's actual hostname:

127.0.0.1   localhost "hostname"

Answer 4 (score: 0)

This may be a duplicate of Spark 1.2.1 standalone cluster mode spark-submit is not working.

I tried the same steps and was able to run the job. Please post the complete spark-env.sh and spark-defaults if possible.

Answer 5 (score: 0)

I ran into this problem because the machine's real IP address had changed and no longer matched the IP recorded in /etc/hosts.
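A quick way to check for such a mismatch (a sketch, assuming a Linux host where hostname -I is available) is to compare the live address with the /etc/hosts entry:

hostname -I                      # addresses currently assigned to this machine
grep "$(hostname)" /etc/hosts    # address recorded for this hostname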

Answer 6 (score: 0)

This problem is related only to the IP address; the error messages in the log file are not informative. Check the following 3 steps:

  1. Check your IP address - it can be checked with the ifconfig or ip command. If your service is not a public one, an IP address starting with 192.168 should be good enough. 127.0.0.1 cannot be used if you are planning a cluster.

  2. Check the environment variable SPARK_MASTER_HOST - make sure there is no typo in the variable name or in the actual IP address.

    env | grep SPARK_

  3. Check with the netstat command that the port you plan to use for the Spark master is free. Do not use a port below 1024. For example:

    netstat -a | grep 9123

If you cannot see the web UI from another machine after the Spark master starts up, open the web UI port with the iptables command.
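As a sketch (assuming the web UI port 8080 from the question and a plain iptables firewall, run as root), opening the port might look like:

iptables -A INPUT -p tcp --dport 8080 -j ACCEPT   # allow incoming connections to the Spark web UI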

Answer 7 (score: 0)

With the DataFrame API (SparkSession), use it as follows:

val spark = SparkSession.builder.appName("BinarizerExample").master("local[*]").config("spark.driver.bindAddress", "127.0.0.1").getOrCreate()

Answer 8 (score: 0)

First option:

The following steps may help:

Get your hostname by using the "hostname" command.

 xxxxxx.ssssss  (e) base  ~  hostname
 xxxxxx.ssssss.net

If an entry for your hostname does not exist there, add one to the /etc/hosts file as follows:

127.0.0.1      xxxxxx.ssssss.net

Second option:

You can set spark.driver.bindAddress in your Spark configuration file (typically conf/spark-defaults.conf):

spark.driver.bindAddress=127.0.0.1

Thanks!

Answer 9 (score: -2)

I solved this problem by modifying the slaves file, at spark-2.4.0-bin-hadoop2.7/conf/slaves. Please check your configuration.
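As a sketch (the worker address below is taken from the question and is only an assumption about your setup), conf/slaves simply lists one worker hostname or IP per line:

# conf/slaves - one worker host per line
192.168.0.12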