sbt - copy SOME libraryDependencies to the output lib folder

Time: 2014-07-03 15:33:50

Tags: scala sbt apache-spark

Using sbt, I want to copy some of the dependency jars to a lib output folder. If possible, I'd like to use the "provided" keyword, as I can with sbt-assembly.

So, given a build.sbt similar to the one below, how can I create a task that copies the ark-tweet-nlp dependency, but not the spark-core dependency, to target/scala-%ver%/lib?

retrieveManaged := true just copies everything, which is not what I want.

...
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"

libraryDependencies += "edu.cmu.cs" % "ark-tweet-nlp" % "0.3.2"

retrieveManaged := true
...

1 answer:

Answer 0 (score: 1)

You can write a task like this:

build.sbt

val retrieveNotProvided = taskKey[Unit]("Copies non provided and non internal dependencies")

// Skip sbt-internal configurations (e.g. "compile-internal") and the
// "provided" configuration.
def isInternalOrProvided(conf: String) = conf.contains("-internal") || conf == "provided"

retrieveNotProvided := {
  val toCopy = new collection.mutable.HashSet[(File, File)]
  val pattern = retrievePattern.value
  val output = managedDirectory.value
  // Walk every resolved artifact in the update report and collect the
  // (cached jar -> destination) pairs we actually want to copy.
  update.value.retrieve { (conf, mid, art, cached) =>
    import org.apache.ivy.core.IvyPatternHelper
    // Expand the Ivy retrieve pattern into the destination file name
    // for this artifact.
    val fileName = IvyPatternHelper.substitute(
      pattern, mid.organization, mid.name, mid.revision, art.name, art.`type`, art.extension, conf
    )
    if (!isInternalOrProvided(conf)) toCopy += (cached -> output / fileName)
    cached
  }
  IO.copy(toCopy)
}

You have to remove retrieveManaged := true from your build.sbt, otherwise sbt will also trigger the original retrieve behavior.
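If you don't need the Ivy retrieve pattern for the destination file names, a shorter variant can lean on the configuration mapping itself: "provided" dependencies sit on the compile classpath but not in the runtime configuration, so selecting runtime jars from the update report skips them. This is a sketch, assuming sbt 0.13's `select`/`configurationFilter` API; the task name `copyRuntimeDeps` and the `target/lib` destination are illustrative, not from the question:

```scala
// Sketch: copy only runtime jars; modules declared % "provided" are not
// part of the runtime configuration and are therefore excluded.
val copyRuntimeDeps = taskKey[Unit]("Copies runtime dependency jars to target/lib")

copyRuntimeDeps := {
  val dest = target.value / "lib"  // illustrative destination folder
  val jars = update.value.select(configurationFilter("runtime"))
  IO.copy(jars.map(jar => jar -> dest / jar.getName))
}
```

With either variant, running the task from the sbt shell (e.g. `retrieveNotProvided`) performs the copy; the "provided" marker on spark-core keeps it out of the output folder.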