SparkContext fails to initialize on sbt run

Posted: 2016-06-22 05:16:16

Tags: scala sbt

My build.sbt file is below:

name := "hello"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "1.6.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
)
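
(As far as I understand sbt, the %% operator appends the Scala binary version to the artifact name, so these lines should resolve to the _2.11 artifacts. For example, the first dependency should be equivalent to the explicit form:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % sparkVersion

so the Scala version and the Spark artifacts should match up.)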

I also have example.scala at src/main/scala/example.scala:

import org.apache.spark._
import org.apache.spark.SparkContext._

object WordCount {
    def main(args: Array[String]) {
      // Run locally on a single thread; the app name shows up in the Spark UI.
      val conf = new SparkConf().setAppName("wordCount").setMaster("local")
      val sc = new SparkContext(conf)
      val input = sc.textFile("food.txt")
      // Split each line into words, then count occurrences of each word.
      val words = input.flatMap(line => line.split(" "))
      val counts = words.map(word => (word, 1)).reduceByKey { case (x, y) => x + y }
      // saveAsTextFile writes a directory named output.txt containing part files.
      counts.saveAsTextFile("output.txt")
    }
}

For some reason, when I run sbt run from the project root (the directory containing src/main/scala), I get this error:

[info] Running WordCount 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/06/21 22:05:08 INFO SparkContext: Running Spark version 1.6.1
16/06/21 22:05:08 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/21 22:05:09 ERROR SparkContext: Error initializing SparkContext.
java.net.UnknownHostException: LM-SFA-11002982: LM-SFA-11002982: nodename nor servname provided, or not known
    at java.net.InetAddress.getLocalHost(InetAddress.java:1475)
    at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:788)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:781)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:781)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:838)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:838)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:838)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:420)
    at WordCount$.main(exam.scala:8)
    at WordCount.main(exam.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at sbt.Run.invokeMain(Run.scala:67)
    at sbt.Run.run0(Run.scala:61)
    at sbt.Run.sbt$Run$$execute$1(Run.scala:51)
    at sbt.Run$$anonfun$run$1.apply$mcV$sp(Run.scala:55)
    at sbt.Run$$anonfun$run$1.apply(Run.scala:55)
    at sbt.Run$$anonfun$run$1.apply(Run.scala:55)
    at sbt.Logger$$anon$4.apply(Logger.scala:84)
    at sbt.TrapExit$App.run(TrapExit.scala:248)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.UnknownHostException: LM-SFA-11002982: nodename nor servname provided, or not known
    at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:901)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1295)
    at java.net.InetAddress.getLocalHost(InetAddress.java:1471)
    ... 23 more
16/06/21 22:05:09 INFO SparkContext: Successfully stopped SparkContext

Can someone explain the problem this error is describing? Is it because my dependencies were not installed correctly, or is there some other reason?

1 Answer:

Answer 0 (score: 5)

It looks like your system's hostname cannot be resolved to an IP address.
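
You can confirm this independently of Spark by reproducing the exact JDK call from your stack trace in a small standalone snippet (HostnameCheck is just an illustrative name):

import java.net.InetAddress

object HostnameCheck {
  def main(args: Array[String]): Unit = {
    // Same call Spark makes in Utils.findLocalInetAddress; it throws
    // UnknownHostException when the machine's hostname (LM-SFA-11002982)
    // has no mapping in DNS or /etc/hosts.
    val host = InetAddress.getLocalHost
    println(s"${host.getHostName} -> ${host.getHostAddress}")
  }
}

If this throws the same UnknownHostException, the problem is your OS configuration, not your sbt dependencies.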

As a [very hacky] workaround, you can try:

echo "127.0.0.1 LM-SFA-11002982" | sudo tee -a /etc/hosts