I am using SBT 1.8.0 for a Spark Scala project in the IntelliJ IDEA 2017.1.6 IDE. I want to create a parent project with child project modules. This is what I have in my build.sbt so far:
lazy val parent = Project("spark-etl-parent", file("."))
  .settings(
    name := "spark-etl-parent_1.0",
    scalaVersion := "2.11.1",
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-hive" % sparkVersion % "provided")
  )
lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )

lazy val redshiftBasin = Project("spark-etl-redshiftBasin", file("redshiftBasin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-redshiftBasin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )

lazy val s3Basin = Project("spark-etl-s3Basin", file("s3Basin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-s3Basin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
Now I can import any class from the spark-streaming or spark-hive library dependencies in the parent module, but not in any of the child modules. I can only use them in a child module if I explicitly specify them as library dependencies in that child module.
Please suggest a solution to fix this.
Answer 0 (Score: 2)
My multi-module project uses the parent project only for building everything, and delegates run to the 'server' project:
lazy val petstoreRoot = project.in(file(".")).
  aggregate(sharedJvm, sharedJs, server, client)
  .settings(organizationSettings)
  .settings(
    publish := {}
    , publishLocal := {}
    , publishArtifact := false
    , isSnapshot := true
    , run := {
      (run in server in Compile).evaluated
    }
  )
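The organizationSettings value used above lives in a separate file (project/Settings in the repo linked below). As a minimal sketch, such a shared-settings value could look like the following; the organization, version, and Scala version here are illustrative placeholders, not the repo's actual values:

// Hypothetical sketch of shared settings kept in project/Settings.scala.
import sbt._
import sbt.Keys._

object Settings {
  lazy val organizationSettings: Seq[Def.Setting[_]] = Seq(
    organization := "com.example", // assumed organization
    version := "1.0.0",            // assumed version
    scalaVersion := "2.12.8"       // assumed Scala version
  )
}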
I group the settings (e.g., the dependencies) in another file, e.g.:
lazy val sharedDependencies: Seq[Def.Setting[_]] = Def.settings(libraryDependencies ++= Seq(
  // %%% selects the artifact for the current platform (JVM or Scala.js);
  // it is provided by the sbt-scalajs / sbt-crossproject plugins
  "org.julienrf" %%% "play-json-derived-codecs" % "4.0.0"
  ...
  , "org.scalatest" %%% "scalatest" % scalaTestV % Test
))
Now each sub-module just adds what it needs, e.g.:
lazy val server = (project in file("server"))
  .settings(scalaJSProjects := Seq(client))
  .settings(sharedSettings(Some("server"))) // shared dependencies used by all
  .settings(serverSettings)
  .settings(serverDependencies)
  .settings(jvmSettings)
  .enablePlugins(PlayScala, BuildInfoPlugin)
  .dependsOn(sharedJvm)
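serverDependencies follows the same pattern as sharedDependencies above, listing only what the server module needs. A hedged sketch with an illustrative artifact (the real list is in the linked project/Settings file):

// Hypothetical sketch: server-only dependencies, same pattern as sharedDependencies.
// The artifact and version below are illustrative, not the repo's actual list.
lazy val serverDependencies: Seq[Def.Setting[_]] = Def.settings(libraryDependencies ++= Seq(
  "com.typesafe.play" %% "play-json" % "2.6.9" // assumed example dependency
))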
You can find the whole project here: https://github.com/pme123/scala-adapters
See the project/Settings file for the dependencies.
Answer 1 (Score: 0)
Using provided->provided in the dependsOn helped me fix a similar problem.
For example:
lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent % "compile->compile;test->test;provided->provided")
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
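This works because the provided->provided mapping puts whatever the parent declares in the provided configuration onto the child's provided classpath as well. For that, the parent must actually declare the Spark artifacts as provided, as in the question. A sketch of the matching parent definition (sparkVersion was not shown in the question, so its value here is an assumption):

// Sketch: the parent must declare Spark in the "provided" configuration for
// "provided->provided" to propagate it to the children.
lazy val sparkVersion = "2.3.0" // assumed; match the Spark version on your cluster

lazy val parent = Project("spark-etl-parent", file("."))
  .settings(
    name := "spark-etl-parent_1.0",
    scalaVersion := "2.11.1",
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
    )
  )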