My build.sbt looks like this:
import sbt._
name := "spark-jobs"
version := "0.1"
scalaVersion := "2.11.8"
resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"
// additional libraries
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.2.0" % "provided",
  "org.apache.spark" % "spark-streaming_2.11" % "2.2.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided",
  "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.2.0"
)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
过去一直有效,直到我决定看看如果我在% "provided"
末尾添加另一个spark-streaming_2.11
会发生什么。它无法解决依赖关系,我继续前进并恢复了更改。但是,它之后似乎也给了我一个例外。现在我的build.sbt看起来就像以前一切工作时一样。不过,它给了我这个例外:
[error] (*:update) sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-streaming_2.11;2.2.0: org.apache.spark#spark-parent_2.11;2.2.0!spark-parent_2.11.pom(pom.original) origin location must be absolute: file:/home/aswin/.m2/repository/org/apache/spark/spark-parent_2.11/2.2.0/spark-parent_2.11-2.2.0.pom
sbt's behaviour is a bit confusing to me. Can someone explain why this is happening? Any good blogs or resources for understanding how sbt actually works would also be appreciated.
This is my project/assembly.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")
project/build.properties:
sbt.version = 1.0.4
project/plugins.sbt:
resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
Thanks!
Answer 0 (score: 0)
If you are using the sbt console, just run the reload command and then try again. After updating dependencies or sbt plugins, you need to reload the project for the changes to take effect.
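For example, a minimal sketch of that workflow from inside the sbt shell (the sbt:spark-jobs> prompt assumes the name setting from the question; update and assembly are shown only as typical follow-up tasks):
sbt:spark-jobs> reload
sbt:spark-jobs> update
sbt:spark-jobs> assembly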
By the way, instead of spelling out the Scala version in each dependency, you can just use the %% operator; it will pick the artifact that matches the scalaVersion you defined:
// additional libraries
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "2.2.0",
  "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0"
)
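To make the expansion concrete (a sketch based on the scalaVersion := "2.11.8" setting in the question), these two declarations resolve to the same artifact:
"org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
"org.apache.spark" % "spark-core_2.11" % "2.2.0" % "provided"
So if you later move to another Scala version, only scalaVersion has to change, not every dependency line.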