I am using IntelliJ IDEA 2016.3.
import sbt.Keys._
import sbt._

object ApplicationBuild extends Build {

  object Versions {
    val spark = "1.6.3"
  }

  val projectName = "example-spark"

  val common = Seq(
    version := "1.0",
    scalaVersion := "2.11.7"
  )

  val customLibraryDependencies = Seq(
    "org.apache.spark" %% "spark-core" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-sql" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-hive" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-streaming" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-streaming-kafka" % Versions.spark
      exclude("log4j", "log4j")
      exclude("org.spark-project.spark", "unused"),
    "com.typesafe.scala-logging" %% "scala-logging" % "3.1.0",
    "org.slf4j" % "slf4j-api" % "1.7.10",
    "org.slf4j" % "slf4j-log4j12" % "1.7.10"
      exclude("log4j", "log4j"),
    "log4j" % "log4j" % "1.2.17" % "provided",
    "org.scalatest" %% "scalatest" % "2.2.4" % "test"
  )
}
I am hitting the runtime exception below, even though I have declared all the dependencies correctly, as shown above.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext
at example.SparkSqlExample.main(SparkSqlExample.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SQLContext
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 6 more
I researched this further on this site and found that it is usually caused by an incorrect entry or a version mismatch in build.sbt. But in my case everything looks fine, as shown above.
Could someone point out where I went wrong?
Answer 0 (score: 4)
I suspect this happens because you marked the Spark dependencies as "provided", but apparently you (or IDEA) are not actually providing them at runtime.
Try removing the "provided" qualifier, or (my preferred way): move the class with the main method to src/test/scala.
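A common middle ground, assuming an sbt 0.13-era build like the one above, is to keep the dependencies "provided" (so they are not bundled when you package for a cluster) but put them back on the classpath when running locally from sbt:

```scala
// Sketch, not part of the original build: add this to the project's
// settings. "provided" dependencies are on the compile classpath but
// dropped from the run classpath; re-using the full compile classpath
// for the `run` task makes `sbt run` work again.
run in Compile <<= Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
)
```

This only affects `sbt run`; packaging (e.g. with an assembly plugin) still excludes the provided jars. Running the main class directly from IDEA would still need the "provided" scope removed or the class moved to src/test/scala, as suggested above.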