Unable to run unit tests (scalatest) on Spark 2.2.0 - Scala 2.11.8

Date: 2017-08-22 11:37:57

Tags: scala apache-spark sbt apache-spark-sql scalatest

Unable to run scalatest with a spark-2.2.0 context.

Stack trace:

  

  Exception or error that caused a run to abort: org.apache.spark.sql.test.SharedSQLContext.eventually(Lorg/scalatest/concurrent/PatienceConfiguration$Timeout;Lscala/Function0;Lorg/scalatest/concurrent/AbstractPatienceConfiguration$PatienceConfig;)Ljava/lang/Object;
  java.lang.NoSuchMethodError: org.apache.spark.sql.test.SharedSQLContext.eventually(Lorg/scalatest/concurrent/PatienceConfiguration$Timeout;Lscala/Function0;Lorg/scalatest/concurrent/AbstractPatienceConfiguration$PatienceConfig;)Ljava/lang/Object;
      at org.apache.spark.sql.test.SharedSQLContext$class.afterEach(SharedSQLContext.scala:92)
      at testUtils.ScalaTestWithContext1.afterEach(ScalaTestWithContext1.scala:7)
      at org.scalatest.BeforeAndAfterEach$$anonfun$1.apply$mcV$sp(BeforeAndAfterEach.scala:234)

Sample code:

  import org.apache.spark.sql.SparkSession
  import testUtils.ScalaTestWithContext1

  class SampLeTest extends ScalaTestWithContext1 {
    override protected def spark: SparkSession = ???

    test("test") {
      (1 == 1) shouldBe true
    }
  }

ScalaTestWithContext1.scala

  import org.apache.spark.sql.QueryTest
  import org.apache.spark.sql.test.SharedSQLContext
  import org.scalatest.{BeforeAndAfterAll, Matchers}

  abstract class ScalaTestWithContext1 extends QueryTest with SharedSQLContext with Matchers with BeforeAndAfterAll {}

build.sbt:

name := "test"
version := "1.0"
scalaVersion := "2.11.11"

parallelExecution in Test := false

libraryDependencies ++= Seq(
  "org.scala-lang" % "scala-library" % "2.11.11" % "provided",
  "org.apache.spark" %% "spark-core" % "2.2.0",
  "org.apache.spark" %% "spark-sql" % "2.2.0",
  "org.apache.spark" %% "spark-catalyst" % "2.2.0",
  "org.apache.spark" %% "spark-core" % "2.2.0" % "test" classifier 
"tests",
  "org.apache.spark" %% "spark-sql" % "2.2.0" % "test" classifier 
"tests",
  "org.apache.spark" %% "spark-catalyst" % "2.2.0" % "test" classifier 
"tests",
  "org.scalatest" %% "scalatest" % "3.0.1" % "test"
) 

The ScalaTestWithContext1 class extends SharedSQLContext and all the required traits.

Thanks in advance.

2 answers:

Answer 0 (score: 0):

I ran into a similar problem. The solution that worked for me was to use the Scalatest version that Spark 2.2.0 itself depends on instead of version 2.2.6. The Maven repository page also shows the correct version in its "Test Dependencies" section.
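In build.sbt terms, that amounts to pinning the test dependency to whatever Spark was built against. A minimal sketch; the "3.0.3" below is only a placeholder, so take the real number from the spark-sql 2.2.0 POM (or from the "Test Dependencies" section mentioned above):

    // build.sbt -- sketch only: align scalatest with the version Spark was built against.
    // "3.0.3" is a placeholder, not a confirmed value; read the actual version from
    // the spark-sql 2.2.0 POM before using this.
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.3" % "test"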

Answer 1 (score: 0):

Similar to what was already pointed out, check the pom.xml file in Spark's github repository to make sure you are using the same versions.
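To confirm which Scalatest version sbt actually resolved onto the test classpath, the stock sbt shell is enough. A quick check, assuming sbt 1.x task syntax (on sbt 0.13 the second command would be `show test:dependencyClasspath`):

    // in the sbt shell:
    // `evicted` reports dependency conflicts and which version won:
    > evicted
    // the scalatest jar listed on this classpath is the one your tests actually run with:
    > show Test/dependencyClasspath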

There may be better solutions, such as upgrading alongside Spark or overriding the Scalatest version preferred by sbt, but as of December 2019, Spark 2.4.4 is using Scalatest 3.0.8, which is fairly recent.
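For the overriding route, a sketch using sbt's standard dependencyOverrides setting; the 3.0.8 figure simply mirrors the Spark 2.4.4 number quoted above, so match it to the Spark release you actually build against:

    // build.sbt -- force every (transitive) scalatest reference onto a single version.
    // 3.0.8 mirrors the Spark 2.4.4 figure above; adjust for your Spark version.
    dependencyOverrides += "org.scalatest" %% "scalatest" % "3.0.8"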