First, I started a master on my local machine with the command
./sbin/start-master.sh
and it works fine. I can reach the web UI at SPARK_MASTER_IP:8080 in a browser, both from the master itself and from a second machine (the one I intend to add as a worker).
Then I ran the command
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://Williams-MacBook-Air.local:7077
("spark://Williams-MacBook-Air.local:7077" is exactly what the web UI shows, and I can also use this address to launch the Scala/Python shells.)
This does not work. Here is what I see in the console:
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
14/07/08 15:59:18 INFO SecurityManager: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/07/08 15:59:18 INFO SecurityManager: Changing view acls to: williamzhang
14/07/08 15:59:18 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(williamzhang)
14/07/08 15:59:19 INFO Slf4jLogger: Slf4jLogger started
14/07/08 15:59:19 INFO Remoting: Starting remoting
14/07/08 15:59:19 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkWorker@172.25.83.121:55179]
14/07/08 15:59:19 INFO Worker: Starting Spark worker 172.25.83.121:55179 with 8 cores, 7.0 GB RAM
14/07/08 15:59:19 INFO Worker: Spark home: /Users/williamzhang/spark-1.0.0
14/07/08 15:59:19 INFO WorkerWebUI: Started WorkerWebUI at http://172.25.83.121:8081
14/07/08 15:59:19 INFO Worker: Connecting to master spark://Williams-MacBook-Air.local:7077...
14/07/08 15:59:24 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@Williams-MacBook-Air.local:7077]. Address is now gated for 60000 ms, all messages to this address will be delivered to dead letters.
14/07/08 15:59:24 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef: Message [org.apache.spark.deploy.DeployMessages$RegisterWorker] from Actor[akka://sparkWorker/user/Worker#-1453195170] to Actor[akka://sparkWorker/deadLetters] was not delivered. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
14/07/08 15:59:39 INFO Worker: Connecting to master spark://Williams-MacBook-Air.local:7077...
14/07/08 15:59:39 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef: Message [org.apache.spark.deploy.DeployMessages$RegisterWorker] from Actor[akka://sparkWorker/user/Worker#-1453195170] to Actor[akka://sparkWorker/deadLetters] was not delivered. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
14/07/08 15:59:59 INFO Worker: Connecting to master spark://Williams-MacBook-Air.local:7077...
14/07/08 15:59:59 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef: Message [org.apache.spark.deploy.DeployMessages$RegisterWorker] from Actor[akka://sparkWorker/user/Worker#-1453195170] to Actor[akka://sparkWorker/deadLetters] was not delivered. [3] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
14/07/08 16:00:19 ERROR Worker: All masters are unresponsive! Giving up.
I have also tried:
using the actual IP address instead of the hostname shown in the web UI (see the sketch below)
restarting the master
using the second machine as the master and the first one as the worker
Both machines have Oracle Java 8 (64-bit) installed. The first runs OS X v10.9 (Mavericks), the second a preview of OS X v10.10 (Yosemite).
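For reference, the IP-based variant of the worker command looked roughly like this (the address is illustrative; it would be whatever IP the master's web UI reports):
# hypothetical master IP -- substitute the address shown on the master's web UI
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://192.168.1.10:7077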
Answer 0 (score: 1)
Not sure about the problem, but I think the usual way to add a worker is:
sbin/start-slave.sh <worker#> <master-spark-URL>
At least that's what works for me. Perhaps the problem with your invocation is that you didn't set a worker number.
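For example, with the master URL from the question and worker number 1 (the number is illustrative):
./sbin/start-slave.sh 1 spark://Williams-MacBook-Air.local:7077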
Answer 1 (score: 1)
I added the new worker's hostname to the file named slaves on the master. Then I ran the script sbin/start-slaves.sh from the master.
It reports that the existing workers are already running and that the new worker is starting, and the new worker also shows up in the web UI.
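A minimal sketch of that setup, assuming the second machine's hostname is second-machine.local (illustrative):
# on the master: list each worker's hostname in conf/slaves, one per line
echo "second-machine.local" >> conf/slaves
# then launch all listed workers from the master:
./sbin/start-slaves.sh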
Answer 2 (score: 1)
I hope it's not too late. Try starting the master with the command "start-master.sh -h IP_OF_MASTER" and then start the slaves. In my case, that helped.
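A sketch of that sequence, assuming 192.168.1.10 stands in for the master's actual IP:
# bind the master explicitly to its IP (hypothetical address)
./sbin/start-master.sh -h 192.168.1.10
# then bring up the workers
./sbin/start-slaves.sh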
Answer 3 (score: 0)
To tell Spark to run 4 worker instances on each slave machine, we create a new configuration file:
# create spark-env.sh file using the provided template:
cp ./conf/spark-env.sh.template ./conf/spark-env.sh
# append a configuration param to the end of the file:
echo "export SPARK_WORKER_INSTANCES=4" >> ./conf/spark-env.sh
Then start Spark with
./sbin/start-master.sh
./sbin/start-slaves.sh
or
./sbin/start-all.sh
The result should be four worker instances running on each slave machine, visible in the master's web UI.