Why does the Scala compiler fail with "object SparkConf in package spark cannot be accessed in package org.apache.spark"?

Asked: 2015-12-05 17:35:06

Tags: scala apache-spark sbt

I cannot access SparkConf in the package, even though I have imported it with `import org.apache.spark.SparkConf`. My code is:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD

import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._

object SparkStreaming {
    def main(arg: Array[String]) = {

        val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
        val ssc = new StreamingContext( conf, Seconds(1) )

        val lines = ssc.socketTextStream("localhost", 9999)
        val words = lines.flatMap(_.split(" "))
        val pairs_new = words.map( w => (w, 1) )
        val wordsCount = pairs_new.reduceByKey(_ + _)
        wordsCount.print() 

        ssc.start() // Start the computation
        ssc.awaitTermination() // Wait for the computation to terminate

    }
}

The sbt dependencies are:

name := "Spark Streaming"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
    "org.apache.spark" %% "spark-mllib" % "1.5.2",
    "org.apache.spark" %% "spark-streaming" % "1.5.2"
)

But the error says SparkConf cannot be accessed:

[error] /home/cliu/Documents/github/Spark-Streaming/src/main/scala/Spark-Streaming.scala:31: object SparkConf in package spark cannot be accessed in package org.apache.spark
[error]         val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
[error]                        ^

2 answers:

Answer 0 (score: 6)

It compiles if you add parentheses after SparkConf:

val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")

The point is that SparkConf is a class, not a function, so the class name can also be used for scoping purposes. When you add parentheses after the class name, you make sure you are calling the class constructor and not the scoping functionality: without them, `new SparkConf.setMaster(...)` is parsed as instantiating a type named `SparkConf.setMaster`, not as calling `setMaster` on a new `SparkConf`. Here is an example in the Scala shell illustrating the difference:

scala> class C1 { var age = 0; def setAge(a:Int) = {age = a}}
defined class C1

scala> new C1
res18: C1 = $iwC$$iwC$C1@2d33c200

scala> new C1()
res19: C1 = $iwC$$iwC$C1@30822879

scala> new C1.setAge(30)  // this doesn't work

<console>:23: error: not found: value C1
          new C1.setAge(30)
              ^

scala> new C1().setAge(30) // this works

scala> 

Answer 1 (score: 1)

You cannot omit the parentheses in this case, so it should be:

val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
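
For contrast, the parentheses can be dropped when nothing is chained directly onto the constructor call, since `new SparkConf` on its own is valid Scala. A minimal sketch of the same setup split into two statements (Spark's builder-style setters such as `setMaster` return the `SparkConf` itself, so chaining on the second line still works; this assumes the same spark-core dependency is on the classpath):

```scala
import org.apache.spark.SparkConf

// Parentheses are optional here because no member access follows `new SparkConf`.
val conf = new SparkConf
conf.setMaster("local[2]").setAppName("NetworkWordCount")
```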