My Ubuntu-based Spark server has started throwing this strange error:
spark.deploy.master.Master --ip 68.140.243.180 --port 7077 --webui-port 8080
========================================
16/12/07 09:12:09 INFO master.Master: Registered signal handlers for [TERM, HUP, INT]
16/12/07 09:12:09 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/12/07 09:12:09 INFO spark.SecurityManager: Changing view acls to: boescst
16/12/07 09:12:09 INFO spark.SecurityManager: Changing modify acls to: boescst
16/12/07 09:12:09 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(boescst); users with modify permissions: Set(boescst)
16/12/07 09:12:10 WARN util.Utils: Service 'sparkMaster' could not bind on port 7077. Attempting port 7078.
16/12/07 09:12:10 WARN util.Utils: Service 'sparkMaster' could not bind on port 7078. Attempting port 7079.
16/12/07 09:12:10 WARN util.Utils: Service 'sparkMaster' could not bind on port 7079. Attempting port 7080.
..
16/12/07 09:12:10 WARN util.Utils: Service 'sparkMaster' could not bind on port 7091. Attempting port 7092.
16/12/07 09:12:10 WARN util.Utils: Service 'sparkMaster' could not bind on port 7092. Attempting port 7093.
Exception in thread "main" java.net.BindException: Can't assign requested address: Service 'sparkMaster' failed after 16 retries!
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
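A "Can't assign requested address" BindException usually indicates that the address being bound is not assigned to any local network interface. As a first sanity check (this is my own sketch; the object name CheckAddr is mine, and the hard-coded address is just the --ip value from the log above), one can enumerate the local interface addresses:

```scala
import java.net.NetworkInterface
import scala.collection.JavaConverters._

// Hypothetical diagnostic: list every address assigned to a local interface
// and check whether the --ip value passed to Spark is among them.
object CheckAddr {
  def main(args: Array[String]): Unit = {
    val target = "68.140.243.180"  // the --ip value from the log above
    val local = NetworkInterface.getNetworkInterfaces.asScala
      .flatMap(_.getInetAddresses.asScala)
      .map(_.getHostAddress)
      .toSet
    println(s"Local addresses: $local")
    println(s"$target assigned locally: ${local.contains(target)}")
  }
}
```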
Note that there is no inherent problem binding to these ports at the OS level. To demonstrate that, I wrote a small TestBind program:
import java.net._

object TestBind {
  def main(args: Array[String]): Unit = {
    val ssock = new ServerSocket()
    ssock.bind(new InetSocketAddress(InetAddress.getLocalHost.getHostName, 7077))
    println(s"Connected to $ssock")
    while (true) {
      ssock.accept()
      println("got another request")
    }
  }
}
Let's run it:
$scala TestBind
Connected to ServerSocket[addr=TCA0080ALKVTAGB/192.168.0.3,localport=7077]
Now test via telnet:
$telnet $(hostname) 7077
Trying 192.168.0.3...
Connected to tca0080alkvtagb.
Escape character is '^]'.
On the server:
got another request
So bind works fine on the server. What, then, is going on with Spark?
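For comparison, here is a variant of TestBind (my own sketch, object name mine) that binds to the literal --ip address Spark was given instead of InetAddress.getLocalHost. If 68.140.243.180 is not actually assigned to a local interface, I would expect this to fail with the same "Can't assign requested address" message Spark reports:

```scala
import java.net._

// Sketch: bind to the literal --ip address from the Spark command line,
// rather than to InetAddress.getLocalHost as TestBind above does.
object TestBindExplicit {
  def main(args: Array[String]): Unit = {
    val ssock = new ServerSocket()
    try {
      ssock.bind(new InetSocketAddress(InetAddress.getByName("68.140.243.180"), 7077))
      println(s"Connected to $ssock")
    } catch {
      case e: BindException =>
        // Presumably the same failure mode as Spark if the address is not local
        println(s"bind failed: ${e.getMessage}")
    } finally ssock.close()
  }
}
```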