Apache Spark installation error

Date: 2018-05-15 13:44:22

Tags: apache-spark spark-streaming

I was able to install Apache Spark on Ubuntu 16 with the following set of commands:

dpkg -i scala-2.12.1.deb 
mkdir /opt/spark 
tar -xvf spark-2.0.2-bin-hadoop2.7.tgz 
cp -rv spark-2.0.2-bin-hadoop2.7/* /opt/spark 
cd /opt/spark 
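
For later sessions it may also help to put the install on the PATH; a minimal sketch, assuming a bash shell (SPARK_HOME and the PATH entry are conventional additions, not part of the commands above):

export SPARK_HOME=/opt/spark           # where the distribution was copied above
export PATH="$SPARK_HOME/bin:$PATH"    # lets spark-shell run without the ./bin/ prefix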

Running the Spark shell works fine:

./bin/spark-shell --master local[2] 

It returns this output in the shell:

jai@jaiPC:/opt/spark$ ./bin/spark-shell --master local[2]
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
18/05/15 19:00:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/05/15 19:00:55 WARN Utils: Your hostname, jaiPC resolves to a loopback address: 127.0.1.1; using 172.16.16.46 instead (on interface enp4s0)
18/05/15 19:00:55 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
18/05/15 19:00:55 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://172.16.16.46:4040
Spark context available as 'sc' (master = local[2], app id = local-1526391055793).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.2
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_171)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 
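
Before looking at the UI, the context itself can be smoke-tested from the prompt. A small sketch of standard calls (not output from the session above):

scala> sc.parallelize(1 to 1000).count()    // a trivial job; should return res0: Long = 1000
scala> sc.getConf.get("spark.driver.host")  // the driver address the UI banner above reports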

But when I try to access

Spark context Web UI available at http://172.16.16.46:4040

the browser shows

The page cannot be displayed

How can I resolve this issue?
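
Based only on the warnings in the output above, here is a sketch of checks that might narrow it down (the address is taken from the log; SPARK_LOCAL_IP is the variable the Utils warning itself names):

curl -I http://172.16.16.46:4040      # from the same machine; any HTTP response means the UI is listening
export SPARK_LOCAL_IP=172.16.16.46    # bind explicitly, as the warning suggests
./bin/spark-shell --master local[2]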

Please help.

Thanks and regards

0 Answers:

No answers yet.