Scalatest Maven plugin "No tests were executed"

Asked: 2016-08-01 13:35:45

Tags: scala maven apache-spark scalatest

I'm trying to do integration testing of Spark with Maven, using scalatest and spark-testing-base. The Spark job reads in a CSV file, validates the results, and inserts the data into a database. I'm trying to test the validation by feeding in files of known format and seeing whether and how they fail. This particular test just makes sure the validation passes. Unfortunately, scalatest can't find my tests.

The relevant pom plugins:

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <configuration>
                <skipTests>true</skipTests>
            </configuration>
        </plugin>
        <!-- enable scalatest -->
        <plugin>
            <groupId>org.scalatest</groupId>
            <artifactId>scalatest-maven-plugin</artifactId>
            <version>1.0</version>
            <configuration>
                <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
                <wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites>
            </configuration>
            <executions>
                <execution>
                    <id>test</id>
                    <goals>
                        <goal>test</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>

Here is the test class:

import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.sql.{DataFrame, DataFrameReader, SQLContext}
import org.scalatest.{BeforeAndAfter, FlatSpec, Matchers}

import scala.util.{Failure, Success, Try}

// SchemaStrategy and SchemaStrategyChooser are project classes (their imports are omitted here)
class ProficiencySchemaITest extends FlatSpec with Matchers with SharedSparkContext with BeforeAndAfter {
    private var schemaStrategy: SchemaStrategy = _
    private var dataReader: DataFrameReader = _

    before {
        val sqlContext = new SQLContext(sc)
        import sqlContext._
        import sqlContext.implicits._

        val dataInReader = sqlContext.read.format("com.databricks.spark.csv")
                                            .option("header", "true")
                                            .option("nullValue", "")
        schemaStrategy = SchemaStrategyChooser("dim_state_test_proficiency")
        dataReader = schemaStrategy.applySchema(dataInReader)
    }

    "Proficiency Validation" should "pass with the CSV file proficiency-valid.csv" in {
        val dataIn = dataReader.load("src/test/resources/proficiency-valid.csv")

        val valid: Try[DataFrame] = Try(schemaStrategy.validateCsv(dataIn))
        valid match {
            case Success(v) => ()
            case Failure(e) => fail("Validation failed on what should have been a clean file: ", e)
        }
    }
}

When I run mvn test, it can't find any tests and outputs this message:

[INFO] --- scalatest-maven-plugin:1.0:test (test) @ load-csv-into-db ---
Discovery starting.
Discovery completed in 54 milliseconds.
Run starting. Expected test count is: 0
DiscoverySuite:
Run completed in 133 milliseconds.
Total number of tests run: 0
Suites: completed 1, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.

Update
Using:

<suites>com.cainc.data.etl.schema.proficiency.ProficiencySchemaITest</suites>

instead of:

<wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites>

I can get one test to run. Obviously, this isn't ideal. It's possible wildcardSuites is broken; I'm going to open a ticket on GitHub and see what happens.
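For reference, a sketch of the plugin block with that workaround applied (assembled purely from the snippets above, nothing new beyond swapping the element):

        <plugin>
            <groupId>org.scalatest</groupId>
            <artifactId>scalatest-maven-plugin</artifactId>
            <version>1.0</version>
            <configuration>
                <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
                <!-- workaround: name the suite explicitly instead of using wildcardSuites -->
                <suites>com.cainc.data.etl.schema.proficiency.ProficiencySchemaITest</suites>
            </configuration>
            <executions>
                <execution>
                    <id>test</id>
                    <goals>
                        <goal>test</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>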

5 Answers:

Answer 0 (score: 2)

This may be because there are space characters in the project path. Remove the spaces from the project path and the tests can be discovered successfully. Hope this helps.

Answer 1 (score: 1)

Try excluding junit as a transitive dependency. It works for me. Example below, but note that the Scala and Spark versions are specific to my environment.

   <dependency>
        <groupId>com.holdenkarau</groupId>
        <artifactId>spark-testing-base_2.10</artifactId>
        <version>1.5.0_0.6.0</version>
        <scope>test</scope>
        <exclusions>
            <!-- junit is not compatible with scalatest -->
            <exclusion>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

Answer 2 (score: 1)

The problem I had with tests not being found came down to the fact that tests are discovered from the class files, so to make the tests discoverable I needed to add <goal>testCompile</goal> to my Scala Maven plugin.
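A minimal sketch of what that can look like, assuming the net.alchim31.maven:scala-maven-plugin; adjust the coordinates and version to whichever Scala compiler plugin your build already uses:

        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <!-- compiles src/test/scala; without this there are no
                             test class files for scalatest to discover -->
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>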

Answer 3 (score: 0)

In my case, it was because I wasn't using the following plugin:

        <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <scalaVersion>${scala.version}</scalaVersion>
                <args>
                    <arg>-target:jvm-1.8</arg>
                </args>
            </configuration>
        </plugin>

Answer 4 (score: -1)

Reason: whenever you run an mvn command, the Maven plugins do not compile the test code.

Workaround:

Run the Scala tests with your IDE; this compiles the test code and puts it in the target directory. The next time you run mvn test, or any maven command that triggers the maven test cycle internally, the Scala tests should run.