I have two integration tests for my DataFrame transformation code (using https://github.com/holdenk/spark-testing-base), and both run fine when executed individually in IntelliJ.
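For context, each suite looks roughly like the sketch below (a minimal, hypothetical example: the suite name, columns, and transformation are placeholders rather than my real code; DataFrameSuiteBase and assertDataFrameEquals come from spark-testing-base):

import com.holdenkarau.spark.testing.DataFrameSuiteBase
import org.scalatest.WordSpec

// Hypothetical stand-in for one of the two real suites
class TransformIntegrationSpec extends WordSpec with DataFrameSuiteBase {

  "the transformation" should {
    "double the value column" in {
      import sqlContext.implicits._

      val input    = Seq(("a", 1), ("b", 2)).toDF("key", "value")
      val expected = Seq(("a", 2), ("b", 4)).toDF("key", "value")

      // placeholder for the real DataFrame transformation under test
      val result = input.withColumn("value", input("value") * 2)

      assertDataFrameEquals(expected, result)
    }
  }
}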
However, when I run the Gradle build, I see the following messages for the first test:
17/04/06 11:29:02 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
and
17/04/06 11:29:05 ERROR SparkContext: Error initializing SparkContext.
akka.actor.InvalidActorNameException: actor name [ExecutorEndpoint] is not unique!
and
java.lang.NullPointerException
at org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
The second test runs partway and then aborts with the following message (this code runs fine on an actual cluster, BTW):
org.apache.spark.SparkException: Job aborted due to stage failure: Task serialization failed: java.lang.NullPointerException
org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
Here is a pastebin of the full build output: https://pastebin.com/drG20kcB
How can I run my Spark integration tests together?
Thanks!
PS: In case it's useful, I'm running the build with the Gradle wrapper (./gradlew clean build).
Answer 0 (score: 0)
I needed this:
test {
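    // allow only one test process (forked JVM) to run at a time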
    maxParallelForks = 1
}
However, I would prefer a solution that toggles parallel execution in Gradle for a specific subset of tests only.
I'm using ScalaTest with WordSpec, BTW.
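Something along these lines is roughly what I have in mind (an untested sketch: the *IntegrationSpec naming convention is made up, and testClassesDirs assumes Gradle 4+):

// run the Spark suites in their own, strictly serial Test task
task sparkIntegrationTest(type: Test) {
    // reuse the compiled test classes and classpath of the default test task
    testClassesDirs = sourceSets.test.output.classesDirs
    classpath = sourceSets.test.runtimeClasspath
    include '**/*IntegrationSpec*'
    maxParallelForks = 1
}

test {
    // keep the ordinary unit tests parallel and exclude the Spark suites here
    exclude '**/*IntegrationSpec*'
    maxParallelForks = 4
}

check.dependsOn sparkIntegrationTest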