SPARK - slave fails when connecting to the master (Unknown message type: -84)

Date: 2016-09-14 11:57:02

Tags: r apache-spark

I am trying to use Spark from R.

I have installed version 2.0.0 on both the master and the slave.

Here is how I start Spark from R (on Windows 2012). Afterwards I can open http://10.1.3.2:4040/executors/ and run some computations.

# Load SparkR from the Spark installation's own R library
library(SparkR, lib.loc = c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib")))
library(rJava)

# Start a local-mode session with 2 GB of driver memory
sparkR.session(enableHiveSupport = FALSE,
               master = "local[*]",
               sparkConfig = list(spark.driver.memory = "2g"),
               spark.sql.warehouse.dir = "C:\\hadoop-2.7.3\\bin")
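
(The "some computations" mentioned above are nothing exotic; a minimal sanity check along these lines runs fine in that session. The snippet below is purely illustrative and uses R's built-in faithful dataset.)

df <- as.DataFrame(faithful)  # distribute a local R data frame as a SparkDataFrame
head(summarize(groupBy(df, df$waiting), count = n(df$waiting)))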

On the slave (CentOS 7.0), I simply extracted this archive: http://d3kbcqa49mib13.cloudfront.net/spark-2.0.0-bin-hadoop2.7.tgz

With no further configuration, I start the worker with: ./sbin/start-slave.sh 10.1.3.2:49198
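
(For reference, the invocation documented for the standalone scripts shipped in that tarball takes a spark:// master URL; the host below is the master IP from above, and 7077 is the default port of a standalone master:)

./sbin/start-master.sh                        # on the master; listens on 7077 by default
./sbin/start-slave.sh spark://10.1.3.2:7077   # on the slave; registers with the master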

I then get the following error in the log file:

Spark Command: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.101-3.b13.el7_2.x86_64/jre/bin/java -cp /home/cloud/spark-2.0.0-bin-hadoop2.7/conf/:/home/cloud/spark-2.0.0-bin-hadoop2.7/jars/* -Xmx1g org.apache.spark.deploy.worker.Worker --webui-port 8081 10.1.3.2:49198
========================================
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/09/14 11:30:09 INFO Worker: Started daemon with process name: 2164@calc-test-sed-slave
16/09/14 11:30:09 INFO SignalUtils: Registered signal handler for TERM
16/09/14 11:30:09 INFO SignalUtils: Registered signal handler for HUP
16/09/14 11:30:09 INFO SignalUtils: Registered signal handler for INT
16/09/14 11:30:09 WARN Utils: Your hostname, calc-test-sed-slave resolves to a loopback address: 127.0.0.1; using 10.1.3.3 instead (on interface eth0)
16/09/14 11:30:09 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
16/09/14 11:30:10 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/09/14 11:30:10 INFO SecurityManager: Changing view acls to: root
16/09/14 11:30:10 INFO SecurityManager: Changing modify acls to: root
16/09/14 11:30:10 INFO SecurityManager: Changing view acls groups to:
16/09/14 11:30:10 INFO SecurityManager: Changing modify acls groups to:
16/09/14 11:30:10 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
16/09/14 11:30:11 INFO Utils: Successfully started service 'sparkWorker' on port 58184.
16/09/14 11:30:12 INFO Worker: Starting Spark worker 10.1.3.3:58184 with 1 cores, 1024.0 MB RAM
16/09/14 11:30:12 INFO Worker: Running Spark version 2.0.0
16/09/14 11:30:12 INFO Worker: Spark home: /home/cloud/spark-2.0.0-bin-hadoop2.7
16/09/14 11:30:12 INFO Utils: Successfully started service 'WorkerUI' on port 8081.
16/09/14 11:30:12 INFO WorkerWebUI: Bound WorkerWebUI to 0.0.0.0, and started at http://10.1.3.3:8081
16/09/14 11:30:12 INFO Worker: Connecting to master 10.1.3.2:49198...
16/09/14 11:30:12 INFO TransportClientFactory: Successfully created connection to /10.1.3.2:49198 after 68 ms (0 ms spent in bootstraps)
16/09/14 11:30:13 WARN Worker: Failed to connect to master 10.1.3.2:49198
org.apache.spark.SparkException: Exception thrown in awaitResult
        at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
        at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
        at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
        at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
        at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
        at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167)
        at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:88)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:96)
        at org.apache.spark.deploy.worker.Worker$$anonfun$org$apache$spark$deploy$worker$Worker$$tryRegisterAllMasters$1$$anon$1.run(Worker.scala:216)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: java.lang.IllegalArgumentException: Unknown message type: -84
        at org.apache.spark.network.shuffle.protocol.BlockTransferMessage$Decoder.fromByteBuffer(BlockTransferMessage.java:70)
        at org.apache.spark.network.netty.NettyBlockRpcServer.receive(NettyBlockRpcServer.scala:54)
        at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:158)
        at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:106)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:119)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at java.lang.Thread.run(Unknown Source)
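
(Note: the two WARN lines near the top show that the hostname resolves to a loopback address and the worker falls back to 10.1.3.3. If binding to a specific address matters, the warning itself suggests pinning it, presumably in conf/spark-env.sh on the slave:)

export SPARK_LOCAL_IP=10.1.3.3   # bind address hinted at by the WARN line; the value here is illustrative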

0 Answers:

There are no answers yet.