Spark web UI cannot be accessed

Date: 2016-08-30 11:40:31

Tags: ubuntu apache-spark ssh cluster-computing apache-spark-standalone

I installed Spark 2.0.0 on 12 nodes (in standalone cluster mode), and when I start it I get this:

./sbin/start-all.sh

starting org.apache.spark.deploy.master.Master, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.master.Master-1-ibnb25.out
localhost192.17.0.17: ssh: Could not resolve hostname localhost192.17.0.17: Name or service not known
192.17.0.20: starting org.apache.spark.deploy.worker.Worker, logging to /home/mbala/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb28.out
192.17.0.21: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb29.out
192.17.0.19: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb27.out
192.17.0.18: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb26.out
192.17.0.24: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb32.out
192.17.0.22: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb30.out
192.17.0.25: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb33.out
192.17.0.28: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb36.out
192.17.0.27: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb35.out
192.17.0.17: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb25.out
192.17.0.26: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb34.out
192.17.0.23: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb31.out
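A side note on the first ssh error above: the target "localhost192.17.0.17" looks like two entries run together on a single line. start-all.sh reads the worker list from conf/slaves, which expects one hostname or IP per line, so one plausible (unconfirmed) fix is to split that entry, e.g.:

 localhost
 192.17.0.17

(and so on, one line per worker through 192.17.0.28).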

I set the master web UI port to 8081; the master's IP is 192.17.0.17, which corresponds to HOSTNAME = ibnb25, and I launched the cluster from that host.
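For context, in a standalone cluster these settings usually live in conf/spark-env.sh on the master. A minimal sketch of what that might look like (whether the poster set them this way is an assumption; the variable names themselves are standard Spark 2.0 ones):

 # conf/spark-env.sh on the master (ibnb25) -- hypothetical reconstruction
 SPARK_MASTER_HOST=192.17.0.17      # address the master binds to
 SPARK_MASTER_WEBUI_PORT=8081       # master web UI port (default is 8080)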

From my local machine, I access the cluster with this command:

 ssh mName@xx.xx.xx.xx 

When I want to access the web UI from my local machine, I use the IP address of the master (host ibnb25):

192.17.0.17:8081

but it would not load, so I tried the address I use to reach the cluster:

xx.xx.xx.xx:8081

but nothing shows up in my browser. What is wrong? Please help me.
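(One common workaround, not part of the original post: if 192.17.0.17 is a private address reachable only from the gateway xx.xx.xx.xx, an ssh tunnel can forward the UI port to the local machine:

 ssh -L 8081:192.17.0.17:8081 mName@xx.xx.xx.xx

and then http://localhost:8081 in the local browser. This assumes the gateway itself can reach 192.17.0.17 on port 8081.)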

1 Answer:

Answer 0 (score: 0)

Your /etc/hosts file does not seem to be set up correctly.

You should get the hostname and IP with these commands:

hostname
hostname -i
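On the master, for example, the output should agree with the values from the question; if hostname -i instead prints something like 127.0.1.1 (a common Ubuntu default), the hosts file is the likely culprit:

 $ hostname
 ibnb25
 $ hostname -i
 192.17.0.17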

Make sure there is whitespace between the IP and the hostname.

A sample /etc/hosts file looks like this:

192.17.0.17  <hostname>
192.17.0.17  localhost
<Other IP1>  <other hostname1>
.
.
.
<Other IP-n>  <other hostname-n>

Make sure the /etc/hosts file on every node in the cluster has entries for all of the cluster's IPs and hostnames.
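A hypothetical way to push one corrected file to every node, assuming the nodes are 192.17.0.17 through 192.17.0.28 as in the startup log, and that mName has passwordless ssh and sudo on each:

 for i in $(seq 17 28); do
   scp /etc/hosts mName@192.17.0.$i:/tmp/hosts           # copy to a staging path
   ssh mName@192.17.0.$i 'sudo mv /tmp/hosts /etc/hosts' # install as root
 done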

For FQDNs, read this.