java.net.UnknownHostException when starting the Spark shell

Date: 2018-01-14 13:38:34

Tags: apache-spark pyspark

I installed Spark following this guide: https://www.davidadrian.cc/posts/2017/08/how-to-spark-cluster/

After that, I got this message:

n@jupyter:~$ spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
java.net.UnknownHostException: jupyter: jupyter: Name or service not known
  at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
  at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:891)
  at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:884)
  at e(Utils.scala:941)
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_112-release)

scala> 1+1
res0: Int = 2

I'm not sure why Spark is trying to look up the host `jupyter`, since this is just the shell starting up.

Edit: adding my .bashrc config.

export PATH=/home/noel/pycharm/jre/bin:$PATH

export HADOOP_HOME=/home/xxx/hadoop_275
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH

export SPARK_HOME=/home/xxx/spark/spark22_hadoop27
export PATH=$SPARK_HOME/bin:$PATH
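As a side note, a common workaround for this class of error (not part of the original config, just a sketch) is to bind Spark to an explicit local address via the documented `SPARK_LOCAL_IP` environment variable, so it never needs to resolve the machine's hostname:

```shell
# Hypothetical workaround: bind Spark to the loopback address so it
# does not have to resolve the hostname "jupyter". Could be added to
# ~/.bashrc alongside the exports above.
export SPARK_LOCAL_IP=127.0.0.1

# Check whether the machine's own hostname resolves at all:
getent hosts "$(hostname)" || echo "hostname does not resolve"
```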

/etc/hosts:

127.0.0.1   localhost
#127.0.1.1  deep-learning

# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
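For illustration (not from the original question): the JVM's `InetAddress.getLocalHost()` essentially resolves the machine's own hostname, so a missing `/etc/hosts` entry for that hostname is enough to trigger the exception. A minimal Python sketch of the same lookup:

```python
import socket

def resolves(name: str) -> bool:
    """Return True if `name` can be resolved to an IP address."""
    try:
        socket.gethostbyname(name)
        return True
    except socket.gaierror:
        return False

# "localhost" is listed in /etc/hosts, so it resolves. The machine's
# own hostname (here, "jupyter") only resolves if it also has an entry
# (like the commented-out "127.0.1.1 deep-learning" line above); when
# it does not, the JVM raises java.net.UnknownHostException.
print(resolves("localhost"))           # expected: True
print(resolves(socket.gethostname()))  # False on the asker's machine
```

Uncommenting the `127.0.1.1` line in `/etc/hosts` and pointing it at the current hostname would make the second lookup succeed.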

0 Answers:

No answers