I downloaded Spark from "https://spark.apache.org/downloads.html" but can't get it to run. When I try to start it in standalone mode, following the instructions at "https://spark.apache.org/docs/latest/spark-standalone.html", I get the error below. Please advise.
adminisatorsmbp:spark-2.0.1-bin-hadoop2.7 amit$ ./sbin/start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /Users/amit/Documents/Analytics/kaggle/Bosch/spark-2.0.1-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.master.Master-1-adminisatorsmbp.out
failed to launch org.apache.spark.deploy.master.Master:
at java.net.InetAddress.getLocalHost(InetAddress.java:1471)
... 10 more
full log in /Users/amit/Documents/Analytics/kaggle/Bosch/spark-2.0.1-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.master.Master-1-adminisatorsmbp.out
adminisatorsmbp:spark-2.0.1-bin-hadoop2.7 amit$
Here is the full log:
Spark Command: /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java -cp /Users/amit/Documents/Analytics/kaggle/Bosch/spark-2.0.1-bin-hadoop2.7/conf/:/Users/amit/Documents/Analytics/kaggle/Bosch/spark-2.0.1-bin-hadoop2.7/jars/* -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --host adminisatorsmbp --port 7077 --webui-port 8080
========================================
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/10/24 17:40:26 INFO Master: Started daemon with process name: 35702@localhost
16/10/24 17:40:26 INFO SignalUtils: Registered signal handler for TERM
16/10/24 17:40:26 INFO SignalUtils: Registered signal handler for HUP
16/10/24 17:40:26 INFO SignalUtils: Registered signal handler for INT
Exception in thread "main" java.net.UnknownHostException: adminisatorsmbp: adminisatorsmbp: nodename nor servname provided, or not known
at java.net.InetAddress.getLocalHost(InetAddress.java:1475)
at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:866)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:859)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:859)
at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:916)
at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:916)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.util.Utils$.localHostName(Utils.scala:916)
at org.apache.spark.deploy.master.MasterArguments.<init>(MasterArguments.scala:30)
at org.apache.spark.deploy.master.Master$.main(Master.scala:1010)
at org.apache.spark.deploy.master.Master.main(Master.scala)
Caused by: java.net.UnknownHostException: adminisatorsmbp: nodename nor servname provided, or not known
at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:901)
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1295)
at java.net.InetAddress.getLocalHost(InetAddress.java:1471)
... 10 more
Answer 0 (score: 0)
In the conf folder, rename spark-env.sh.template to spark-env.sh and slaves.template to slaves. Then retry the steps above, and make sure /etc/hosts contains the correct IP-to-hostname mapping for your machine.
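As a rough sketch of those steps (assuming Spark is unpacked at the path shown in the log, and that `adminisatorsmbp` is this machine's hostname as reported by `hostname`):

```shell
cd /Users/amit/Documents/Analytics/kaggle/Bosch/spark-2.0.1-bin-hadoop2.7

# Create the real config files from the shipped templates
cp conf/spark-env.sh.template conf/spark-env.sh
cp conf/slaves.template conf/slaves

# The UnknownHostException means the hostname does not resolve to any
# address; map it to loopback so InetAddress.getLocalHost() succeeds
echo "127.0.0.1   adminisatorsmbp" | sudo tee -a /etc/hosts

./sbin/start-master.sh
```

Alternatively, instead of editing /etc/hosts you can pin Spark to an explicit address, e.g. by adding a line such as `SPARK_LOCAL_IP=127.0.0.1` to conf/spark-env.sh, so that Spark skips the hostname lookup entirely.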