value wholeTextFiles is not a member of org.apache.spark.SparkContext

Date: 2015-06-16 05:31:29

Tags: scala apache-spark

I have the following Scala code:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark._

object RecipeIO {

  val sc = new SparkContext(new SparkConf().setAppName("Recipe_Extraction"))

  def read(INPUT_PATH: String): org.apache.spark.rdd.RDD[(String)] = {
    val data = sc.wholeTextFiles("INPUT_PATH")
    val files = data.map { case (filename, content) => filename }
    (files)
  }
}

When I compile this code with sbt, it gives the error:

value wholeTextFiles is not a member of org.apache.spark.SparkContext

I am importing everything that should be required, but it still gives me this error. However, when I replace wholeTextFiles with textFile, the code compiles.

What might the problem be, and how can I fix it? Thanks in advance!
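For reference, the two methods also return differently shaped RDDs, so they are not drop-in replacements for each other. A minimal sketch of the contrast (the path here is hypothetical, not from the original post):

// textFile yields one RDD element per line of text.
val lines: org.apache.spark.rdd.RDD[String] = sc.textFile("/data/recipes")
// wholeTextFiles yields one (path, content) pair per file.
val pairs: org.apache.spark.rdd.RDD[(String, String)] = sc.wholeTextFiles("/data/recipes")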

Environment

Scala compiler version 2.10.2
Spark 1.2.0

Error

[info] Set current project to RecipeIO (in build file:/home/akshat/RecipeIO/)
[info] Compiling 1 Scala source to /home/akshat/RecipeIO/target/scala-2.10.4/classes...
[error] /home/akshat/RecipeIO/src/main/scala/RecipeIO.scala:14: value wholeTexFiles is not a member of org.apache.spark.SparkContext
[error]  val data = sc.wholeTexFiles(INPUT_PATH)
[error]                ^
[error] one error found
[error] {file:/home/akshat/RecipeIO/}default-55aff3/compile:compile: Compilation failed
[error] Total time: 16 s, completed Jun 15, 2015 11:07:04 PM

My build.sbt file looks like this:

name := "RecipeIO"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "0.9.0-incubating"

libraryDependencies += "org.eclipse.jetty" % "jetty-server" % "8.1.2.v20120308" 

ivyXML := 
<dependency org="org.eclipse.jetty.orbit" name="javax.servlet" rev="3.0.0.v201112011016">
<artifact name="javax.servlet" type="orbit" ext="jar"/>
</dependency>
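Note: the build file above pins spark-core at 0.9.0-incubating, while the Environment section states Spark 1.2.0. wholeTextFiles was only added in Spark 1.0, so a 0.9.x dependency cannot resolve it even when spelled correctly. A build.sbt sketch aligned with the stated Spark version (an assumption for illustration, not the poster's actual file):

name := "RecipeIO"

version := "1.0"

scalaVersion := "2.10.4"

// Spark 1.2.0 matches the version listed under Environment and includes wholeTextFiles.
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.2.0"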

1 answer:

Answer 0 (score: -1)

You have a typo: it should be wholeTextFiles, not wholeTexFiles.

As a side note, if you actually want to use the INPUT_PATH variable, I think you need sc.wholeTextFiles(INPUT_PATH) rather than sc.wholeTextFiles("INPUT_PATH"), which reads the literal string "INPUT_PATH" as the path.
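Applying both fixes, here is a minimal sketch of the corrected object (an illustration, not the poster's final code; it assumes INPUT_PATH points at a directory of input files):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

object RecipeIO {

  val sc = new SparkContext(new SparkConf().setAppName("Recipe_Extraction"))

  // wholeTextFiles yields (path, content) pairs; keep only the path.
  def read(INPUT_PATH: String): RDD[String] = {
    val data = sc.wholeTextFiles(INPUT_PATH) // the variable, not the literal "INPUT_PATH"
    data.map { case (filename, content) => filename }
  }
}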