How to exclude test dependencies using sbt-assembly

Date: 2019-04-02 08:18:22

Tags: scala sbt sbt-assembly

I have an sbt project that I am trying to build into a jar with the sbt-assembly plugin.

build.sbt:

      name := "project-name"

      version := "0.1"

      scalaVersion := "2.11.12"

      val sparkVersion = "2.4.0"

      libraryDependencies ++= Seq(
        "org.scalatest" %% "scalatest" % "3.0.5" % "test",
        "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
        "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
        "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
        "com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "test",
        // spark-hive dependencies for DataFrameSuiteBase. https://github.com/holdenk/spark-testing-base/issues/143
        "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
        "com.amazonaws" % "aws-java-sdk" % "1.11.513" % "provided",
        "com.amazonaws" % "aws-java-sdk-sqs" % "1.11.513" % "provided",
        "com.amazonaws" % "aws-java-sdk-s3" % "1.11.513" % "provided",
        //"org.apache.hadoop" % "hadoop-aws" % "3.1.1"
        "org.json" % "json" % "20180813"
      )

      assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
      assemblyMergeStrategy in assembly := {
       case PathList("META-INF", xs @ _*) => MergeStrategy.discard
       case x => MergeStrategy.first
      }
      test in assembly := {}

      // https://github.com/holdenk/spark-testing-base
      fork in Test := true
      javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")
      parallelExecution in Test := false

When I build the project with `sbt assembly`, the resulting jar contains `/org/junit/...` and `/org/opentest4j/...` files.

Is there a way to keep these test-related files out of the final jar?

I tried replacing the line:

    "org.scalatest" %% "scalatest" % "3.0.5" % "test"

with:

    "org.scalatest" %% "scalatest" % "3.0.5" % "provided"

I would also like to know how these files end up in the jar, since junit is not referenced anywhere in build.sbt (although the project does contain junit tests).
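To see exactly which library drags junit in, you can ask sbt for the reverse dependency tree. On sbt 1.4+ the `dependencyTree` / `whatDependsOn` tasks ship with sbt itself; on sbt 0.13.x (as used here) a common approach is to add the sbt-dependency-graph plugin — the plugin version below is an example, not taken from the question:

```scala
// project/plugins.sbt — only needed on sbt 0.13.x / early sbt 1.x;
// sbt 1.4+ bundles these tasks natively.
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")
```

Then, from the sbt shell, `whatDependsOn junit junit` prints every path in the dependency graph that ends at junit, which identifies the dependencies worth excluding.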

Updated:

    name := "project-name"

    version := "0.1"

    scalaVersion := "2.11.12"

    val sparkVersion = "2.4.0"

    val excludeJUnitBinding = ExclusionRule(organization = "junit")

    libraryDependencies ++= Seq(
      // Provided
      "org.apache.spark" %% "spark-core" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
      "org.apache.spark" %% "spark-sql" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
      "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
      "com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "provided" excludeAll(excludeJUnitBinding),
      "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
      "com.amazonaws" % "aws-java-sdk" % "1.11.513" % "provided",
      "com.amazonaws" % "aws-java-sdk-sqs" % "1.11.513" % "provided",
      "com.amazonaws" % "aws-java-sdk-s3" % "1.11.513" % "provided",

      // Test
      "org.scalatest" %% "scalatest" % "3.0.5" % "test",

      // Necessary
      "org.json" % "json" % "20180813"
    )

    excludeDependencies += excludeJUnitBinding

    // https://stackoverflow.com/questions/25144484/sbt-assembly-deduplication-found-error
    assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
    assemblyMergeStrategy in assembly := {
     case PathList("META-INF", xs @ _*) => MergeStrategy.discard
     case x => MergeStrategy.first
    }


    // https://github.com/holdenk/spark-testing-base
    fork in Test := true
    javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")
    parallelExecution in Test := false

1 Answer:

Answer 0 (score: 1):

To exclude certain transitive dependencies of a dependency, use the `excludeAll` or `exclude` method.

The `exclude` method should be used when a pom will be published for the project. It requires the organization and module name of the exclusion.

For example:

libraryDependencies += 
  "log4j" % "log4j" % "1.2.15" exclude("javax.jms", "jms")

The `excludeAll` method is more flexible, but because it cannot be represented in a pom.xml, it should only be used when a pom is not needed.

For example:

libraryDependencies +=
  "log4j" % "log4j" % "1.2.15" excludeAll(
    ExclusionRule(organization = "com.sun.jdmk"),
    ExclusionRule(organization = "com.sun.jmx"),
    ExclusionRule(organization = "javax.jms")
  )

In some cases a transitive dependency should be excluded from all dependencies. This can be achieved by adding `ExclusionRule`s to `excludeDependencies` (sbt 0.13.8 and above):

excludeDependencies ++= Seq(
  ExclusionRule("commons-logging", "commons-logging")
)

The JUnit jar files are pulled in transitively by the following dependencies:

"org.apache.spark" %% "spark-core" % sparkVersion % "provided" //(junit)
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided"// (junit)
"com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "test" //(org.junit)

To exclude the junit files, update your dependencies as follows:

val excludeJUnitBinding = ExclusionRule(organization = "junit")

  "org.scalatest" %% "scalatest" % "3.0.5" % "test",
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  "com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "test" excludeAll(excludeJUnitBinding)

Update: please update your build.sbt as follows.

resolvers += Resolver.url("bintray-sbt-plugins",
  url("https://dl.bintray.com/eed3si9n/sbt-plugins/"))(Resolver.ivyStylePatterns)

val excludeJUnitBinding = ExclusionRule(organization = "junit")

libraryDependencies ++= Seq(
  // Provided
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  "com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "provided" excludeAll(excludeJUnitBinding),
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
  //"com.amazonaws" % "aws-java-sdk" % "1.11.513" % "provided",
  //"com.amazonaws" % "aws-java-sdk-sqs" % "1.11.513" % "provided",
  //"com.amazonaws" % "aws-java-sdk-s3" % "1.11.513" % "provided",

  // Test
  "org.scalatest" %% "scalatest" % "3.0.5" % "test",

  // Necessary
  "org.json" % "json" % "20180813"
)

assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}

fork in Test := true
javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")
parallelExecution in Test := false

project/plugins.sbt

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")

I have tried this, and the junit jar files are no longer downloaded.
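Rather than opening the assembled jar by hand, the check can be scripted. Below is a minimal Scala sketch that scans a jar's entries for a package prefix such as `org/junit/`; the jar path in the comment is an assumption based on the project name and Scala version in this build, so adjust it to your own output:

```scala
import java.util.zip.ZipFile

// Returns true if any entry in the jar lives under the given path prefix.
// Example (path is hypothetical, derived from name/version/scalaVersion above):
//   hasEntriesUnder("target/scala-2.11/project-name-assembly-0.1.jar", "org/junit/")
def hasEntriesUnder(jarPath: String, prefix: String): Boolean = {
  val zip = new ZipFile(jarPath)
  try {
    val entries = zip.entries()
    var found = false
    while (entries.hasMoreElements && !found) {
      // A jar is a zip archive, so entry names are slash-separated class paths.
      found = entries.nextElement().getName.startsWith(prefix)
    }
    found
  } finally zip.close()
}
```

If `hasEntriesUnder(jar, "org/junit/")` and `hasEntriesUnder(jar, "org/opentest4j/")` both return false after rebuilding, the exclusions worked.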