I added the following dependencies for Apache Spark on Scala 2.11 in my build.sbt file:
name := "Project1"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.1"
libraryDependencies ++= Seq(
"org.scala-lang" % "scala-compiler" % "2.11.8",
"org.scala-lang" % "scala-reflect" % "2.11.8",
"org.scala-lang.modules" % "scala-parser-combinators_2.11" % "1.0.4",
"org.scala-lang.modules" % "scala-xml_2.11" % "1.0.4"
)
However, IntelliJ cannot resolve the spark-core_2.11 dependency. I have tried several times without success. Thanks in advance.
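As an aside (not part of the original question), a common way to avoid hard-coding the `_2.11` artifact suffix is sbt's `%%` operator, which appends the binary-version suffix derived from `scalaVersion` automatically. A minimal sketch of the same build.sbt in that style, assuming the same versions:

```scala
// build.sbt (sketch) — %% appends the _2.11 suffix from scalaVersion,
// so artifact names stay in sync if the Scala version changes.
name := "Project1"
version := "1.0"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.1",
  // scala-compiler and scala-reflect are not cross-published,
  // so they keep the plain % operator; scalaVersion.value keeps
  // them pinned to the project's Scala version.
  "org.scala-lang" % "scala-compiler" % scalaVersion.value,
  "org.scala-lang" % "scala-reflect" % scalaVersion.value,
  "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4",
  "org.scala-lang.modules" %% "scala-xml" % "1.0.4"
)
```

This does not change what sbt resolves; it only removes the duplicated suffix, which is one less place for a version mismatch that IntelliJ's sbt import would surface as an unresolved dependency.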
Answer 0 (score: 0)
我在IntelliJ 2016.3.2中遇到了与Scala / Spark几乎相同的问题:
name := "some-project"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
To get it to work, I had to manually add the spark-core jar to my project libraries, i.e.: