NoClassDefFoundError: org/apache/spark/SparkConf at runtime even though it compiles

Date: 2016-09-05 13:43:27

Tags: scala intellij-idea apache-spark sbt

I've tried the following simple code:

import org.apache.spark.SparkConf

object WordCounter {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Word Counter").setMaster("local")
  }
}

It compiles fine, but when I run it, I get:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkConf

This is very surprising, since (1) I have the import statement right there, and (2) it used to work before.

I'm pretty sure this is because I recently changed my sbt files (build.sbt and project/assembly.sbt) in the following way:

build.sbt:

name := "SparkPlayground"

version := "1.0"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"
libraryDependencies += "com.github.scala-incubator.io" %% "scala-io-core" % "0.4.3"
libraryDependencies += "com.github.scala-incubator.io" %% "scala-io-file" % "0.4.3"
libraryDependencies += "org.apache.hadoop" % "hadoop-streaming" % "2.7.0"

// added the two lines below:
assemblyJarName in assembly := s"${name.value.replace(' ','-')}-${version.value}.jar"
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
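
If I understand the provided scope correctly, it keeps spark-core on the compile classpath but off the runtime classpath (on the assumption that a Spark installation supplies it at launch), which would explain why the code compiles yet fails to run. One workaround I have seen in the sbt-assembly README (untested by me) is to put the provided dependencies back on the classpath for the run task only:

// put dependencies marked "provided" back on the run task's classpath
// (snippet adapted from the sbt-assembly README; sbt 0.13 syntax)
run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated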

I'm quite new to sbt, but I think the upshot of these changes is that I now need to create some kind of uber-jar and run everything from that context, rather than invoking the generated classes directly. However, I don't see any uber-jar anywhere in my IntelliJ project.
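
For reference, my understanding of the uber-jar workflow is roughly the following, where the jar path is my guess based on the assemblyJarName and scalaVersion settings above:

sbt assembly
spark-submit --class WordCounter --master local target/scala-2.11/SparkPlayground-1.0.jar

Since includeScala = false, the assembly presumably relies on spark-submit to supply the Scala library, so running the jar with a plain java -jar would not work.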

How do I build and run this project correctly, ideally through IntelliJ options rather than manual sbt console commands?

0 Answers:

No answers yet.