Apache Spark code problem

Time: 2017-11-02 09:59:45

Tags: scala apache-spark

The code does not run.

scala> import org.apache.spark.SparkContext
import org.apache.spark.SparkContext

scala> import org.apache.spark.SparkContext.
| import org.apache.spark.
<console>:2: error: identifier expected but 'import' found.
import org.apache.spark.
^
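
The trailing dot after SparkContext leaves the REPL waiting for another identifier, so the import keyword on the continuation line triggers the "identifier expected" error. A plain or wildcard import works, for example:

scala> import org.apache.spark.SparkContext
scala> import org.apache.spark.SparkContext._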

scala> object SparkWordCount {
 |    def main(args: Array[String]) {
 |     val sc = new SparkContext( "local", "Word Count", "/usr/local/spark", Nil, Map(), Map())
 |       val input = sc.textFile("C:\\x.txt")
 |       Val count = input.flatMap(line ⇒ line.split(" "))
 |       .map(word ⇒ (word, 1))
 |       .reduceByKey(_ + _)
 |       count.saveAsTextFile("C:\\x1.txt")
 |       System.out.println("OK");
 |    }
 | }

After I enter this, I get:

<console>:48: error: overloaded method constructor SparkContext with alternatives:
  (master: String, appName: String, sparkHome: String, jars: Seq[String], environment: scala.collection.Map[String,String])org.apache.spark.SparkContext <and>
  (master: String, appName: String, conf: org.apache.spark.SparkConf)org.apache.spark.SparkContext <and>
  ()org.apache.spark.SparkContext <and>
  (config: org.apache.spark.SparkConf)org.apache.spark.SparkContext
 cannot be applied to (String, String, String, scala.collection.immutable.Nil.type, scala.collection.immutable.Map[Nothing,Nothing], scala.collection.immutable.Map[Nothing,Nothing])
       val sc = new SparkContext( "local", "Word Count", "/usr/local/spark", Nil, Map(), Map())
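
The compiler is listing every constructor it knows: the closest alternative takes five arguments (master, appName, sparkHome, jars, environment), while the call above passes six, so no overload matches. Dropping the extra Map(), or better, switching to the SparkConf-based constructor, resolves it; a minimal sketch:

import org.apache.spark.{SparkConf, SparkContext}

// Build the configuration first, then hand it to the single-argument constructor
val conf = new SparkConf().setMaster("local").setAppName("Word Count")
val sc = new SparkContext(conf)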




<console>:50: error: not found: value Val
         Val count = input.flatMap(line ? line.split(" "))
         ^
<console>:53: error: ambiguous reference to overloaded definition,
both method count in object functions of type (columnName: String)org.apache.spark.sql.TypedColumn[Any,Long]
and  method count in object functions of type (e: org.apache.spark.sql.Column)org.apache.spark.sql.Column
match expected type ?
       count.saveAsTextFile("C:\\x1.txt")
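
Scala is case-sensitive, so Val is not the val keyword but an undefined identifier, which is the first of these errors. Because that declaration failed, the later reference to count no longer names the RDD; in spark-shell it resolves instead to the two overloaded count helpers that org.apache.spark.sql.functions brings into scope, hence the "ambiguous reference". Lowercasing the keyword fixes both:

// val, not Val: Scala keywords are all lowercase
val count = input.flatMap(line => line.split(" "))
  .map(word => (word, 1))
  .reduceByKey(_ + _)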

When I try to load the text file, that does not work either:

scala> val file = sc.textFile("c:\\x.txt")
<console>:46: error: not found: value sc
   val file = sc.textFile("c:\\x.txt")
              ^
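
spark-shell normally creates sc at startup, so "not found: value sc" suggests the shell failed to initialize it; on Windows this is commonly caused by a missing winutils.exe / HADOOP_HOME setup. As a workaround you can build a context by hand; a minimal sketch:

import org.apache.spark.{SparkConf, SparkContext}

// Create a local SparkContext manually when the shell did not provide one
val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("shell"))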

When I use a single "\" I get an invalid escape character error, and when I use "\\" nothing is displayed:

scala> val file = sc.textFile("c:\x.txt")
<console>:1: error: invalid escape character
val file = sc.textFile("c:\x.txt")
                       ^
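
In a Scala string literal \x is not a recognized escape sequence, which is exactly what the error says. Any of these spellings of a Windows path works:

val a = sc.textFile("c:\\x.txt")    // escape the backslash
val b = sc.textFile(raw"c:\x.txt")  // raw interpolator: no escape processing
val c = sc.textFile("c:/x.txt")     // forward slashes are also accepted on Windows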

scala>

Please help me, I don't understand why this is happening. I have Spark installed on Windows.

1 Answer:

Answer 0 (score: 1)

There is quite a lot wrong with your code. Try the following code:

import org.apache.spark._

object SparkWordCount {
  def main(args: Array[String]): Unit = {
    // Build the configuration, then pass it to the SparkContext constructor
    val conf = new SparkConf().setMaster("local").setAppName("Word Count")
    val sc = new SparkContext(conf)

    // Read the file, split each line into words, and sum a 1 per word occurrence
    val input = sc.textFile("C:\\x.txt")
    val count = input.flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    count.saveAsTextFile("C:\\x1.txt")
    println("OK")
  }
}
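
Two things to keep in mind when running this: saveAsTextFile writes a directory named C:\x1.txt containing part-* files, and it fails if that path already exists, so remove it between runs. Packaged as a standalone application it would be launched with spark-submit; a sketch, with a hypothetical jar name:

spark-submit --class SparkWordCount --master local wordcount.jar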

I hope it works.