OpenJDK Server VM and unresolved dependency warnings when trying to add Spark dependencies in IntelliJ

Posted: 2018-01-23 05:46:59

Tags: scala apache-spark intellij-idea

I get the following error when adding the Spark dependencies:

Error while importing sbt project:

OpenJDK Server VM warning: ignoring option MaxPermSize=384M; support was removed in 8.0

::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.12;1.6.1: not found
[warn]  :: org.apache.spark#spark-streaming_2.12;1.6.1: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]      org.apache.spark:spark-core_2.12:1.6.1 (/home/chiralcarbon/IdeaProjects/pleasework/build.sbt#L8-12)
[warn]        +- default:pleasework_2.12:0.1
[warn]      org.apache.spark:spark-streaming_2.12:1.6.1 (/home/chiralcarbon/IdeaProjects/pleasework/build.sbt#L8-12)
[warn]        +- default:pleasework_2.12:0.1

Here is the build.sbt file:

 name := "pleasework"

    version := "0.1"

    scalaVersion := "2.12.4"
    val sparkVersion = "1.6.1"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion,
      "org.apache.spark" %% "spark-streaming" % sparkVersion
    )

How do I resolve these errors?

1 Answer:

Answer 0 (score: 2)

Spark has not been released for Scala 2.12 yet. You need to use Scala 2.11 (I believe 2.11.11 is the latest 2.11 release) so that you get spark-core_2.11: sbt appends the "_2.11" suffix for you when you use the %% operator, which is why the warning says it cannot find spark-core_2.12. In general, you can find the latest published versions of Java or Scala libraries by searching the Maven Central Repository.
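
For reference, a corrected build.sbt might look like the sketch below. It follows the suggestion above; the choice of 2.11.11 is the answer's assumption about the latest 2.11 release, and the project name is kept from the question.

    name := "pleasework"

    version := "0.1"

    // Spark 1.6.x artifacts are published for Scala 2.10 and 2.11, not 2.12
    scalaVersion := "2.11.11"

    val sparkVersion = "1.6.1"

    libraryDependencies ++= Seq(
      // %% appends the Scala binary suffix, so these resolve to
      // spark-core_2.11 and spark-streaming_2.11 on Maven Central
      "org.apache.spark" %% "spark-core" % sparkVersion,
      "org.apache.spark" %% "spark-streaming" % sparkVersion
    )

After changing scalaVersion, reimport the sbt project in IntelliJ so the dependencies are resolved again.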