How to set a dependency's scope in sbt based on the environment?

Date: 2018-02-01 04:19:30

Tags: scala sbt dependency-management

How can I set up dependencies in build.sbt differently depending on the environment? For example:

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "compile"  // expected in dev
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided" // expected in prod

Any suggestions?

2 answers:

Answer 0 (score: 3)

You can set an environment identifier in an environment variable and use Scala pattern matching in build.sbt to get the result you want.

Your build.sbt should look like this:

val mode = sys.env.getOrElse("EXEC_MODE", "dev") // can be hardcoded
val devSparkVersion  = "2.0.2"
val prodSparkVersion = "1.6.2"

mode match {
  case "dev"  => libraryDependencies += "org.apache.spark" %% "spark-core" % devSparkVersion
  case "prod" => libraryDependencies += "org.apache.spark" %% "spark-core" % prodSparkVersion
}
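The question asked about switching the dependency *scope* (compile vs. provided) rather than the version; the same pattern-match approach covers that too. A sketch, reusing the hypothetical `EXEC_MODE` variable from the answer:

```scala
// build.sbt sketch (assumption: same EXEC_MODE convention as above):
// select the dependency configuration per environment
val sparkScope = sys.env.getOrElse("EXEC_MODE", "dev") match {
  case "prod" => "provided" // prod cluster already supplies Spark
  case _      => "compile"  // dev needs Spark on the classpath
}

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % sparkScope
```

You would then run, e.g., `EXEC_MODE=prod sbt package` for a production build, and plain `sbt package` for development.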

Answer 1 (score: 1)

I have never tried this, but according to this documentation:

http://www.scala-sbt.org/1.0/docs/Configuring-Scala.html

it looks like if you set

autoScalaLibrary := false

then you can use "test", "compile" or "runtime".
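Concretely, the linked Configuring-Scala page describes declaring the Scala library as an ordinary dependency once automatic management is turned off, at which point it can carry a configuration like any other module. A minimal sketch, assuming sbt 1.x syntax:

```scala
// build.sbt sketch (untested): manage scala-library manually
// so it can be placed in a specific configuration
autoScalaLibrary := false

libraryDependencies +=
  "org.scala-lang" % "scala-library" % scalaVersion.value % "runtime"
```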