I am building an SBT multi-project build that contains a common module and a logic module, with logic.dependsOn(common).

In common, Spark SQL 2.2.1 is brought in via "org.apache.spark" %% "spark-sql" % "2.2.1". Spark SQL is also used in logic, but there I get a compile error saying "object spark is not a member of package org.apache".
Now, if I add the Spark SQL dependency directly to logic as "org.apache.spark" %% "spark-sql" % "2.2.1", it compiles fine. But if I add it as "org.apache.spark" %% "spark-sql" % "2.2.1" % Provided, I get the same error.

I don't understand why this happens, or why the dependency is not carried over transitively from common to logic.
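Spelled out with the literal coordinates (the actual build file below uses a spark_sql val for this), these are the two variants I tried in logic:

// works: compile scope
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1"

// fails with the same "object spark is not a member of package org.apache" error
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1" % Provided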
This is the root sbt file:
lazy val commonSettings = Seq(
  organization := "...",
  version := "0.1.0",
  scalaVersion := "2.11.12",
  resolvers ++= Seq(
    clojars,
    maven_local,
    novus,
    twitter,
    spark_packages,
    artima
  ),
  test in assembly := {},
  assemblyMergeStrategy in assembly := {...}
)

lazy val root = (project in file(".")).aggregate(common, logic)

lazy val common = (project in file("common")).settings(commonSettings: _*)

lazy val logic = (project in file("logic")).dependsOn(common).settings(commonSettings: _*)
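(The resolver and dependency names used in these files, such as clojars, spark_sql, embedded_elasticsearch, and scalatest, are vals from our project/Dependencies.scala. A rough sketch of it follows; only the spark-sql coordinates are the exact ones mentioned above, the other coordinates and versions are placeholders:)

import sbt._

object Dependencies {
  // resolvers, e.g.
  val clojars = "clojars" at "https://repo.clojars.org/"
  // maven_local, novus, twitter, spark_packages, artima are defined the same way

  // library dependencies
  val spark_sql = "org.apache.spark" %% "spark-sql" % "2.2.1"
  val embedded_elasticsearch = "pl.allegro.tech" % "embedded-elasticsearch" % "..." // coordinates assumed
  val scalatest = "org.scalatest" %% "scalatest" % "..." // version elided
}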
And this is the logic module's sbt file:
libraryDependencies ++= Seq(
  spark_sql.exclude("io.netty", "netty"),
  embedded_elasticsearch % "test",
  scalatest % "test"
)

dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-core" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5",
  "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-annotations" % "2.6.5",
  "org.json4s" %% "json4s-jackson" % "3.2.11"
)

assemblyJarName in assembly := "***.jar"
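For completeness, the workaround that currently compiles is replacing the spark_sql reference in logic with the explicit compile-scope dependency, keeping the netty exclusion:

libraryDependencies ++= Seq(
  ("org.apache.spark" %% "spark-sql" % "2.2.1").exclude("io.netty", "netty"),
  embedded_elasticsearch % "test",
  scalatest % "test"
)

But I would rather understand why the dependency declared in common does not reach logic in the first place.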