spark-submit [master node]

Date: 2018-12-27 10:57:38

Tags: scala apache-spark spark-submit apache-spark-standalone

I have set up a Spark cluster locally on a Windows 7 machine, with one master node and one worker node. I built a simple jar with `sbt compile` and `sbt package`, and I am trying to submit it to the Spark master with spark-submit. For now the master and the worker run on the same machine; if this works, the plan is to deploy the local cluster across multiple machines. Eventually all of this will run on Azure.
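For context, a minimal `build.sbt` for producing such a jar might look like the sketch below. The `name`, `organization`, and `version` values are illustrative assumptions; the only detail taken from the logs is the Scala version implied by the `scala-2.12` artifact path.

```scala
// Minimal sbt build definition (hypothetical values except the Scala line).
// `sbt package` with this file emits target/scala-2.12/sbtexample_2.12-0.1.0-SNAPSHOT.jar.
name := "sbtexample"
organization := "example1"
version := "0.1.0-SNAPSHOT"
scalaVersion := "2.12.7"
```

One thing worth double-checking with a setup like this: the Scala binary version of the jar should match the one the Spark distribution on the cluster was built against.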

Master node

Microsoft Windows [Version 6.1.7601]
Copyright (c) 2009 Microsoft Corporation.  All rights reserved.

C:\Windows\system32>spark-class org.apache.spark.deploy.master.Master
2018-12-26 20:00:45 INFO  Master:2612 - Started daemon with process name: 13968@ws-amalhotra
2018-12-26 20:00:45 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-12-26 20:00:45 INFO  SecurityManager:54 - Changing view acls to: admin
2018-12-26 20:00:45 INFO  SecurityManager:54 - Changing modify acls to: admin
2018-12-26 20:00:45 INFO  SecurityManager:54 - Changing view acls groups to:
2018-12-26 20:00:45 INFO  SecurityManager:54 - Changing modify acls groups to:
2018-12-26 20:00:45 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(admin); groups with view permissions: Set(); users with modify permissions: Set(admin); groups with modify permissions: Set()
2018-12-26 20:00:46 INFO  Utils:54 - Successfully started service 'sparkMaster' on port 7077.
2018-12-26 20:00:46 INFO  Master:54 - Starting Spark master at spark://192.168.8.101:7077
2018-12-26 20:00:46 INFO  Master:54 - Running Spark version 2.3.2
2018-12-26 20:00:46 INFO  log:192 - Logging initialized @1268ms
2018-12-26 20:00:46 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2018-12-26 20:00:46 INFO  Server:419 - Started @1334ms
2018-12-26 20:00:46 INFO  AbstractConnector:278 - Started ServerConnector@16391414{HTTP/1.1,[http/1.1]}{0.0.0.0:8080}
2018-12-26 20:00:46 INFO  Utils:54 - Successfully started service 'MasterUI' on port 8080.
2018-12-26 20:00:46 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@204e3825{/app,null,AVAILABLE,@Spark}
2018-12-26 20:00:46 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@748394e8{/app/json,null,AVAILABLE,@Spark}
2018-12-26 20:00:46 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@19b99890{/,null,AVAILABLE,@Spark}
2018-12-26 20:00:46 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5c0f561c{/json,null,AVAILABLE,@Spark}
2018-12-26 20:00:46 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3443bda1{/static,null,AVAILABLE,@Spark}
2018-12-26 20:00:46 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@54541f46{/app/kill,null,AVAILABLE,@Spark}
2018-12-26 20:00:46 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6e8c3d12{/driver/kill,null,AVAILABLE,@Spark}
2018-12-26 20:00:46 INFO  MasterWebUI:54 - Bound MasterWebUI to 0.0.0.0, and started at http://ws-amalhotra.ivp.co.in:8080
2018-12-26 20:00:46 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2018-12-26 20:00:46 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@22eb9260{/,null,AVAILABLE}
2018-12-26 20:00:46 INFO  AbstractConnector:278 - Started ServerConnector@7e6d0324{HTTP/1.1,[http/1.1]}{192.168.8.101:6066}
2018-12-26 20:00:46 INFO  Server:419 - Started @1394ms
2018-12-26 20:00:46 INFO  Utils:54 - Successfully started service on port 6066.
2018-12-26 20:00:46 INFO  StandaloneRestServer:54 - Started REST server for submitting applications on port 6066
2018-12-26 20:00:46 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1a4c3e84{/metrics/master/json,null,AVAILABLE,@Spark}
2018-12-26 20:00:46 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5a3b4746{/metrics/applications/json,null,AVAILABLE,@Spark}
2018-12-26 20:00:46 INFO  Master:54 - I have been elected leader! New state: ALIVE
2018-12-26 20:00:54 INFO  Master:54 - Registering worker 192.168.8.101:8089 with 8 cores, 14.9 GB RAM
2018-12-26 20:01:20 INFO  Master:54 - Driver submitted org.apache.spark.deploy.worker.DriverWrapper
2018-12-26 20:01:20 INFO  Master:54 - Launching driver driver-20181226200120-0000 on worker worker-20181226200053-192.168.8.101-8089
2018-12-26 20:01:22 INFO  Master:54 - Removing driver: driver-20181226200120-0000
2018-12-26 20:01:25 WARN  TransportChannelHandler:78 - Exception in connection from /192.168.8.101:63501
java.io.IOException: An existing connection was forcibly closed by the remote host
        at sun.nio.ch.SocketDispatcher.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
        at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:288)
        at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1106)
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:343)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
        at java.lang.Thread.run(Thread.java:748)
2018-12-26 20:01:25 WARN  TransportChannelHandler:78 - Exception in connection from /192.168.8.101:63557
java.io.IOException: An existing connection was forcibly closed by the remote host
        at sun.nio.ch.SocketDispatcher.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
        at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:288)
        at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1106)
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:343)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
        at java.lang.Thread.run(Thread.java:748)
2018-12-26 20:01:25 INFO  Master:54 - 192.168.8.101:63501 got disassociated, removing it.
2018-12-26 20:01:25 INFO  Master:54 - 192.168.8.101:63557 got disassociated, removing it.
2018-12-26 20:01:25 INFO  Master:54 - 192.168.8.101:63556 got disassociated, removing it.

Worker node

Microsoft Windows [Version 6.1.7601]
Copyright (c) 2009 Microsoft Corporation.  All rights reserved.

C:\Windows\system32>spark-class org.apache.spark.deploy.worker.Worker spark://192.168.8.101:7077 -p 8089
2018-12-26 20:00:53 INFO  Worker:2612 - Started daemon with process name: 13960@ws-amalhotra
2018-12-26 20:00:53 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-12-26 20:00:53 INFO  SecurityManager:54 - Changing view acls to: admin
2018-12-26 20:00:53 INFO  SecurityManager:54 - Changing modify acls to: admin
2018-12-26 20:00:53 INFO  SecurityManager:54 - Changing view acls groups to:
2018-12-26 20:00:53 INFO  SecurityManager:54 - Changing modify acls groups to:
2018-12-26 20:00:53 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(admin); groups with view permissions: Set(); users with modify permissions: Set(admin); groups with modify permissions: Set()
2018-12-26 20:00:53 INFO  Utils:54 - Successfully started service 'sparkWorker' on port 8089.
2018-12-26 20:00:54 INFO  Worker:54 - Starting Spark worker 192.168.8.101:8089 with 8 cores, 14.9 GB RAM
2018-12-26 20:00:54 INFO  Worker:54 - Running Spark version 2.3.2
2018-12-26 20:00:54 INFO  Worker:54 - Spark home: C:\spark
2018-12-26 20:00:54 INFO  log:192 - Logging initialized @1367ms
2018-12-26 20:00:54 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2018-12-26 20:00:54 INFO  Server:419 - Started @1411ms
2018-12-26 20:00:54 INFO  AbstractConnector:278 - Started ServerConnector@319b7858{HTTP/1.1,[http/1.1]}{0.0.0.0:8081}
2018-12-26 20:00:54 INFO  Utils:54 - Successfully started service 'WorkerUI' on port 8081.
2018-12-26 20:00:54 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@32b3c348{/logPage,null,AVAILABLE,@Spark}
2018-12-26 20:00:54 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3db48501{/logPage/json,null,AVAILABLE,@Spark}
2018-12-26 20:00:54 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2606a6f4{/,null,AVAILABLE,@Spark}
2018-12-26 20:00:54 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6cf2a898{/json,null,AVAILABLE,@Spark}
2018-12-26 20:00:54 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@75f7a48a{/static,null,AVAILABLE,@Spark}
2018-12-26 20:00:54 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@563f35a1{/log,null,AVAILABLE,@Spark}
2018-12-26 20:00:54 INFO  WorkerWebUI:54 - Bound WorkerWebUI to 0.0.0.0, and started at http://ws-amalhotra.ivp.co.in:8081
2018-12-26 20:00:54 INFO  Worker:54 - Connecting to master 192.168.8.101:7077...
2018-12-26 20:00:54 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4b2bed3e{/metrics/json,null,AVAILABLE,@Spark}
2018-12-26 20:00:54 INFO  TransportClientFactory:267 - Successfully created connection to /192.168.8.101:7077 after 28 ms (0 ms spent in bootstraps)
2018-12-26 20:00:54 INFO  Worker:54 - Successfully registered with master spark://192.168.8.101:7077
2018-12-26 20:01:20 INFO  Worker:54 - Asked to launch driver driver-20181226200120-0000
2018-12-26 20:01:20 INFO  DriverRunner:54 - Copying user jar file:/D:/_Work/azurepoc/sbtexample/target/scala-2.12/sbtexample_2.12-0.1.0-SNAPSHOT.jar to C:\spark\work\driver-20181226200120-0000\sbtexample_2.12-0.1.0-SNAPSHOT.jar
2018-12-26 20:01:20 INFO  Utils:54 - Copying D:\_Work\azurepoc\sbtexample\target\scala-2.12\sbtexample_2.12-0.1.0-SNAPSHOT.jar to C:\spark\work\driver-20181226200120-0000\sbtexample_2.12-0.1.0-SNAPSHOT.jar
2018-12-26 20:01:20 INFO  DriverRunner:54 - Launch Command: "C:\Program Files\Java\jdk1.8.0_181\bin\java" "-cp" "C:\spark\bin\..\conf\;C:\spark\jars\*" "-Xmx1024M" "-Dspark.master=spark://192.168.8.101:7077" "-Dspark.driver.supervise=false" "-Dspark.submit.deployMode=cluster" "-Dspark.jars=file:/D:/_Work/azurepoc/sbtexample/target/scala-2.12/sbtexample_2.12-0.1.0-SNAPSHOT.jar" "-Dspark.rpc.askTimeout=10s" "-Dspark.app.name=example1.HelloWorld" "org.apache.spark.deploy.worker.DriverWrapper" "spark://Worker@192.168.8.101:8089" "C:\spark\work\driver-20181226200120-0000\sbtexample_2.12-0.1.0-SNAPSHOT.jar" "example1.HelloWorld"
2018-12-26 20:01:22 INFO  Worker:54 - Driver driver-20181226200120-0000 exited successfully

spark-submit command

C:\Users\amalhotra>spark-submit --deploy-mode cluster --master spark://192.168.8.101:7077 --class "example1.HelloWorld" "D:\_Work\azurepoc\sbtexample\target\scala-2.12\sbtexample_2.12-0.1.0-SNAPSHOT.jar"
Running Spark using the REST application submission protocol.
2018-12-26 20:01:09 INFO  RestSubmissionClient:54 - Submitting a request to launch an application in spark://192.168.8.101:7077.
2018-12-26 20:01:19 WARN  RestSubmissionClient:66 - Unable to connect to server spark://192.168.8.101:7077.
Warning: Master endpoint spark://192.168.8.101:7077 was not a REST server. Falling back to legacy submission gateway instead.
2018-12-26 20:01:19 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

C:\Users\amalhotra>spark-submit --deploy-mode cluster --master spark://192.168.8.101:7077 --class "example1.HelloWorld" "D:\_Work\azurepoc\sbtexample\target\scala-2.12\sbtexample_2.12-0.1.0-SNAPSHOT.jar"

Code

package example1

import java.io._

object HelloWorld {
  def main(args: Array[String]): Unit = {
    println("===============================================")
    println("===============================================")
    println("Hello, world!")
    println("===============================================")
    println("===============================================")
  }
}

The error I get on the master node:

2018-12-26 20:01:25 WARN  TransportChannelHandler:78 - Exception in connection from /192.168.8.101:63501
java.io.IOException: An existing connection was forcibly closed by the remote host
        at sun.nio.ch.SocketDispatcher.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
        at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:288)
        at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1106)
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:343)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
        at java.lang.Thread.run(Thread.java:748)
2018-12-26 20:01:25 WARN  TransportChannelHandler:78 - Exception in connection from /192.168.8.101:63557
java.io.IOException: An existing connection was forcibly closed by the remote host
        at sun.nio.ch.SocketDispatcher.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
        at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:288)
        at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1106)
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:343)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
        at java.lang.Thread.run(Thread.java:748)

Things I have made sure of:

  • It is not a port-related problem, since I have completely disabled antivirus and firewall rules.
  • I also copied the whole setup to my personal computer (to rule out an infrastructure-related problem).
  • There is nothing in the Event Viewer.

Edit: Many thanks to Sc0rpion for the tip; that was silly of me. I was submitting the job to the Spark master URL, whereas, according to the link Sc0rpion shared, it should be submitted to the REST endpoint. I must have missed that while reading the documentation.

However, one concern remains. I have verified that my job runs by writing a text file from the program, but when I submit the job to the Spark master I do not see anything printed on the master's or the worker's console.
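For what it's worth, the file-writing check described above can be sketched as plain Scala like the following (the output path is an illustrative assumption). In cluster mode the driver runs on a worker, so `println` output lands in the driver's stdout log under the worker's work directory (e.g. under `C:\spark\work\driver-...\`), not on the console of spark-submit, the master, or the worker:

```scala
package example1

import java.io.{File, PrintWriter}

object HelloWorld {
  def main(args: Array[String]): Unit = {
    // Console output in cluster mode goes to the driver's stdout file on the
    // worker, so writing to a known file is one way to confirm the job ran.
    // The path below is hypothetical; adjust it to a directory the worker
    // process can write to.
    val writer = new PrintWriter(new File("C:\\spark\\work\\helloworld-output.txt"))
    try writer.println("Hello, world!")
    finally writer.close()
  }
}
```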

The command used to submit the job to the master:

C:\Users\amalhotra>spark-submit --deploy-mode cluster --master spark://192.168.8.101:6066 --class "example1.HelloWorld" "D:\_Work\azurepoc\sbtexample\target\scala-2.12\sbtexample_2.12-0.1.0-SNAPSHOT.jar"

The master responds as follows:

2018-12-28 13:54:25 INFO  Master:54 - Driver submitted org.apache.spark.deploy.worker.DriverWrapper
2018-12-28 13:54:25 INFO  Master:54 - Launching driver driver-20181228135425-0002 on worker worker-20181228134824-192.168.8.101-8089
2018-12-28 13:54:27 INFO  Master:54 - Removing driver: driver-20181228135425-0002

The worker responds:

2018-12-28 13:54:25 INFO  Worker:54 - Asked to launch driver driver-20181228135425-0002
2018-12-28 13:54:25 INFO  DriverRunner:54 - Copying user jar file:/D:/_Work/azurepoc/sbtexample/target/scala-2.12/sbtexample_2.12-0.1.0-SNAPSHOT.jar to C:\spark\work\driver-20181228135425-0002\sbtexample_2.12-0.1.0-SNAPSHOT.jar
2018-12-28 13:54:25 INFO  Utils:54 - Copying D:\_Work\azurepoc\sbtexample\target\scala-2.12\sbtexample_2.12-0.1.0-SNAPSHOT.jar to C:\spark\work\driver-20181228135425-0002\sbtexample_2.12-0.1.0-SNAPSHOT.jar
2018-12-28 13:54:25 INFO  DriverRunner:54 - Launch Command: "C:\Program Files\Java\jdk1.8.0_181\bin\java" "-cp" "C:\spark\bin\..\conf\;C:\spark\jars\*" "-Xmx1024M" "-Dspark.master=spark://192.168.8.101:7077" "-Dspark.driver.supervise=false" "-Dspark.submit.deployMode=cluster" "-Dspark.jars=file:/D:/_Work/azurepoc/sbtexample/target/scala-2.12/sbtexample_2.12-0.1.0-SNAPSHOT.jar" "-Dspark.app.name=example1.HelloWorld" "org.apache.spark.deploy.worker.DriverWrapper" "spark://Worker@192.168.8.101:8089" "C:\spark\work\driver-20181228135425-0002\sbtexample_2.12-0.1.0-SNAPSHOT.jar" "example1.HelloWorld"
2018-12-28 13:54:27 INFO  Worker:54 - Driver driver-20181228135425-0002 exited successfully

But the application's output is nowhere to be seen. It should print something like the following:

(screenshot of the expected console output: the "=" banner lines with "Hello, world!" between them)

1 Answer:

Answer 0 (score: 2)

As pointed out in the linked question — Unable to submit jobs to spark cluster (cluster-mode) — use:

spark-submit --deploy-mode cluster --master spark://192.168.8.101:6066 --class "example1.HelloWorld" "D:\_Work\azurepoc\sbtexample\target\scala-2.12\sbtexample_2.12-0.1.0-SNAPSHOT.jar"

instead of port 7077.