How to include Spark's tests as a Maven dependency

Asked: 2015-10-24 22:03:12

Tags: maven apache-spark

I inherited old code that depends on

org.apache.spark.LocalSparkContext

which lives in the spark-core tests. The spark-core jar (correctly) does not include test-only classes, and I can't tell whether Spark's test classes are published as their own Maven artifact. What is the right approach here?

3 Answers:

Answer 0 (score: 5)

You can depend on Spark's test classes by declaring the dependency with <type>test-jar</type>. For example, for Spark 1.5.1 built against Scala 2.11:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>1.5.1</version>
  <type>test-jar</type>
  <scope>test</scope>
</dependency>

This dependency provides all of Spark's test classes, including LocalSparkContext.
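With the test-jar on the test classpath, a suite can mix in LocalSparkContext, which supplies a mutable `sc` field and stops the context after each test. A minimal sketch (the suite name and RDD contents are illustrative; ScalaTest and the spark-core test-jar must both be on the test classpath, so this will not compile without them):

```scala
import org.apache.spark.{LocalSparkContext, SparkConf, SparkContext}
import org.scalatest.FunSuite

// Hypothetical suite: LocalSparkContext resets `sc` after each test,
// so each test can create its own local SparkContext.
class WordCountSuite extends FunSuite with LocalSparkContext {
  test("countByValue counts duplicate elements") {
    sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("test"))
    val counts = sc.parallelize(Seq("a", "b", "a")).countByValue()
    assert(counts("a") == 2L)
  }
}
```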

Answer 1 (score: 1)

I came here hoping to find the equivalent for SBT. For reference for other SBT users: applying the pattern of using test-jars in SBT to Spark 2.0 gives:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0" classifier "tests"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.0.0" classifier "tests"

Answer 2 (score: 0)

If you want to add the test jars with SBT, you can do so as described below:


version := "0.1"

scalaVersion := "2.11.11"
val sparkVersion = "2.3.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-core" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-core" % sparkVersion % Test classifier "test-sources",
  "org.apache.spark" %% "spark-sql" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-sql" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-sql" % sparkVersion % Test classifier "test-sources",
  "org.apache.spark" %% "spark-catalyst" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-catalyst" % sparkVersion % Test classifier "test-sources",
  "com.typesafe.scala-logging" %% "scala-logging" % "3.9.0",
  "org.scalatest" %% "scalatest" % "3.0.4" % "test")

If you want to add them as Maven dependencies instead, you can do so as shown below:


    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.parent.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.parent.version}</artifactId>
        <version>${spark.version}</version>
        <classifier>tests</classifier>
        <type>test-jar</type>
        <scope>test</scope>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.parent.version}</artifactId>
        <version>${spark.version}</version>
        <classifier>test-sources</classifier>
        <type>test-jar</type>
        <scope>test</scope>
    </dependency>
