Spark word count error in IntelliJ due to Scala 2.12.1

Time: 2017-01-08 10:47:06

Tags: scala apache-spark intellij-idea

  

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
    at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:762)
    at org.apache.spark.SparkConf$.<init>(SparkConf.scala:615)
    at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
    at org.apache.spark.SparkConf.set(SparkConf.scala:84)
    at org.apache.spark.SparkConf.set(SparkConf.scala:73)
    at org.apache.spark.SparkConf.setMaster(SparkConf.scala:105)

package bigdata.spark_applications

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("WordCount")
    val sc = new SparkContext(conf)
    // Read the input file, split each line into words, and count occurrences
    val data = sc.textFile("C:\\Users\\scala.txt")
    val result = data.flatMap(_.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
    result.collect.foreach(println)
  }
}

1 Answer:

Answer 0: (score: 5)

Hi, I found the answer.

I was using Scala 2.12.1, and spark-core 2.1.0 is not published for Scala 2.12. Scala is not binary compatible across minor versions, so a Spark artifact built against 2.11 fails on 2.12 with the NoClassDefFoundError above (the scala.Product$class implementation class was removed by the new trait encoding in 2.12). So in the project I switched to Scala 2.11.8 and changed spark-core to the _2.11 dependency:

version := "1.0"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.0"
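
A side note, not from the original answer: sbt's %% operator appends the Scala binary-version suffix from scalaVersion automatically, so the same dependencies can be written without hard-coding _2.11. A minimal sketch:

version := "1.0"

scalaVersion := "2.11.8"

// %% resolves the artifacts to spark-core_2.11 and spark-sql_2.11
// because scalaVersion is 2.11.x
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.0"

This keeps scalaVersion and the artifact suffix from drifting apart, which is exactly the mismatch that caused the NoClassDefFoundError here.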