Error when running spark-shell: ERROR Remoting: Remoting error: [Startup failed]

Date: 2014-10-10 18:38:10

Tags: apache-spark

I am new to Apache Spark. I tried to install Apache Spark 1.0.2 with Scala 2.10.4 on Windows 7, following this guide:

http://sankalplabs.wordpress.com/2014/08/25/installing-apache-spark-on-windows-step-by-step-approach/

When starting the Spark shell, I get the following exception:

ERROR Remoting: Remoting error: [Startup failed] [
akka.remote.RemoteTransportException: Startup failed
        at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala
:129)
        at akka.remote.Remoting.start(Remoting.scala:194)
        at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:
184)
        at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
        at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
        at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
        at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:10
4)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:152)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
        at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:
957)
        at $line3.$read$$iwC$$iwC.<init>(<console>:8)
        at $line3.$read$$iwC.<init>(<console>:14)
        at $line3.$read.<init>(<console>:16)
        at $line3.$read$.<init>(<console>:20)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.<init>(<console>:7)
        at $line3.$eval$.<clinit>(<console>)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:
788)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:
1056)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614
)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:7
96)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.sca
la:841)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply
(SparkILoopInit.scala:121)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply
(SparkILoopInit.scala:120)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:263)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoop
Init.scala:120)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:56)

        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mc
Z$sp$5.apply$mcV$sp(SparkILoop.scala:913)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.s
cala:142)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:56)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkIL
oopInit.scala:104)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:
56)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(Spar
kILoop.scala:930)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.
scala:884)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.
scala:884)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClass
Loader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: Oleander
/192.168.1.7:0
        at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:2
72)
        at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(Ne
ttyTransport.scala:391)
        at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(Ne
ttyTransport.scala:388)
        at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
        at scala.util.Try$.apply(Try.scala:161)
        at scala.util.Success.map(Try.scala:206)
        at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
        at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
        at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(Ba
tchingExecutor.scala:67)
        at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(Batc
hingExecutor.scala:82)
        at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExe
cutor.scala:59)
        at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExe
cutor.scala:59)
        at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72
)
        at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:42)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(Abst
ractDispatcher.scala:386)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool
.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:19
79)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThre
ad.java:107)
Caused by: java.net.BindException: Cannot assign requested address: bind
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Unknown Source)
        at sun.nio.ch.Net.bind(Unknown Source)
        at sun.nio.ch.ServerSocketChannelImpl.bind(Unknown Source)
        at sun.nio.ch.ServerSocketAdaptor.bind(Unknown Source)
        at org.jboss.netty.channel.socket.nio.NioServerBoss$RegisterTask.run(Nio
ServerBoss.java:193)
        at org.jboss.netty.channel.socket.nio.AbstractNioSelector.processTaskQue
ue(AbstractNioSelector.java:366)
        at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNi
oSelector.java:290)
        at org.jboss.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.ja
va:42)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
]
org.jboss.netty.channel.ChannelException: Failed to bind to: Oleander/192.168.1.
7:0
        at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:2
72)
        at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(Ne
ttyTransport.scala:391)
        at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(Ne
ttyTransport.scala:388)
        at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
        at scala.util.Try$.apply(Try.scala:161)
        at scala.util.Success.map(Try.scala:206)
        at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
        at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
        at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(Ba
tchingExecutor.scala:67)
        at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(Batc
hingExecutor.scala:82)
        at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExe
cutor.scala:59)
        at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExe
cutor.scala:59)
        at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72
)
        at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:42)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(Abst
ractDispatcher.scala:386)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool
.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:19
79)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThre
ad.java:107)
Caused by: java.net.BindException: Cannot assign requested address: bind
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Unknown Source)
        at sun.nio.ch.Net.bind(Unknown Source)
        at sun.nio.ch.ServerSocketChannelImpl.bind(Unknown Source)
        at sun.nio.ch.ServerSocketAdaptor.bind(Unknown Source)
        at org.jboss.netty.channel.socket.nio.NioServerBoss$RegisterTask.run(Nio
ServerBoss.java:193)
        at org.jboss.netty.channel.socket.nio.AbstractNioSelector.processTaskQue
ue(AbstractNioSelector.java:366)
        at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNi
oSelector.java:290)
        at org.jboss.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.ja
va:42)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)

To me the instructions seem incomplete. Which steps are missing? Is there a complete (and concise) list of steps for installing Spark on Windows 7?

Many thanks for your help, Felix

2 answers:

Answer 0 (score: 5):

The exception went away once I added a configuration file 'spark-env.cmd' to the 'conf' folder. In that file I specified the local IP address (which I obtained via ipconfig -all), like this: set SPARK_LOCAL_IP=192.168.1.111

Or alternatively set: SPARK_LOCAL_IP=LOCALHOST
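
For reference, a minimal conf\spark-env.cmd along the lines of this answer might look as follows; this is only a sketch, and 192.168.1.111 is an example value that should be replaced with the address ipconfig reports on your machine (or with localhost):

rem conf\spark-env.cmd - minimal sketch based on the answer above
rem Bind Spark to an explicit local address; replace with your own IPv4 address
rem from ipconfig, or use localhost to bind to the loopback interface.
set SPARK_LOCAL_IP=192.168.1.111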

Answer 1 (score: 0):

For a similar error, it may help to check the configuration of the environment variables as described here.

To summarize: when Spark is installed, conf/spark-env.sh (or conf/spark-env.cmd on Windows) does not exist by default. So, to modify the environment variables, first copy the template file conf/spark-env.sh.template to conf/spark-env.sh. Then, in spark-env.sh, find the line

# - SPARK_LOCAL_IP, to set the IP address Spark binds to on this node

and change it to

SPARK_LOCAL_IP=LOCALHOST
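
Put together, the steps on a Unix-like system could look roughly like this (a sketch run from the Spark home directory; appending the assignment instead of uncommenting the template line has the same effect, and the machine's real IP address can be used instead of localhost):

# Create spark-env.sh from the shipped template, then set the bind address.
cp conf/spark-env.sh.template conf/spark-env.sh
# Append the setting; replace localhost with the node's IP address if needed.
echo "SPARK_LOCAL_IP=localhost" >> conf/spark-env.sh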