How do I use the Spark test helpers to write unit tests for my Spark application?

Asked: 2018-12-11 00:40:58

Tags: unit-testing apache-spark

I am trying to write Spark unit tests using helpers such as withTable, withSQLConf, and withTempView, which Spark uses in its own test suites, for example https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/JoinSuite.scala

However, I cannot use them: even after importing many packages, the compiler still does not recognize them. I have imported all of the following:

import org.apache.spark.internal.config
import org.apache.spark.sql._
import org.apache.spark.sql.catalyst.TableIdentifier
import org.apache.spark.sql.catalyst.analysis.{FunctionRegistry, NoSuchPartitionException, NoSuchTableException, TempTableAlreadyExistsException}
import org.apache.spark.sql.catalyst.catalog._
import org.apache.spark.sql.catalyst.catalog.CatalogTypes.TablePartitionSpec
import org.apache.spark.sql.hive.test.TestHiveSingleton
import org.apache.spark.sql.internal.SQLConf
import org.apache.spark.sql.internal.StaticSQLConf.CATALOG_IMPLEMENTATION
import org.apache.spark.sql.test.{SQLTestUtils, SharedSQLContext}
import org.apache.spark.sql.types._
import org.apache.spark.util.Utils
import org.scalatest.FunSuite
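For context, helpers like withTable, withSQLConf, and withTempView are not standalone imports: they are methods defined on the trait org.apache.spark.sql.test.SQLTestUtils (its implementation lives under sql/core/src/test in the Spark repository), so a test suite gains them by mixing the trait in, the way Spark's own JoinSuite mixes in SharedSQLContext. A minimal sketch of such a suite, assuming the Spark test sources are on the test classpath (the table name and suite name are illustrative):

```scala
import org.apache.spark.sql.{QueryTest, Row}
import org.apache.spark.sql.test.SharedSQLContext

// SharedSQLContext extends SQLTestUtils, which is where withTable,
// withSQLConf and withTempView are defined, and it also provides the
// shared `spark` session used by `sql(...)` below.
class MyAppSuite extends QueryTest with SharedSQLContext {

  test("withTable drops the table when the block ends") {
    withTable("t") {
      sql("CREATE TABLE t(id INT) USING parquet")
      sql("INSERT INTO t VALUES (1), (2)")
      checkAnswer(sql("SELECT count(*) FROM t"), Row(2L))
    }
    // Here the table "t" has been dropped automatically.
  }
}
```

These helpers each follow the same loan pattern: they set up a resource or configuration, run the enclosed block, and restore the previous state afterwards, even if the block throws.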

What should I add to my build.sbt? Also, where are these "helpers" implemented/documented?
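The reason the imports above fail to resolve is that these traits live in Spark's test sources, which are not part of the regular compile-scope artifacts. Spark does publish its test code as separate "tests"-classified jars, which sbt can pull in with a classifier. A sketch of the relevant build.sbt fragment, assuming a Spark 2.x build (the version number is illustrative and must match the Spark version you compile against):

```scala
// build.sbt (sketch): pull in Spark's published test-jars.
// spark-sql tests   -> SQLTestUtils, SharedSQLContext, QueryTest
// spark-catalyst tests -> PlanTest and other catalyst test helpers
// spark-core tests  -> SparkFunSuite, the base class they all share
val sparkVersion = "2.4.0" // illustrative; match your Spark version

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"      % sparkVersion,
  "org.apache.spark" %% "spark-core"     % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-catalyst" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-sql"      % sparkVersion % Test classifier "tests"
)
```

There is no separate reference documentation for these helpers; their behavior is documented in the source itself, chiefly sql/core/src/test/scala/org/apache/spark/sql/test/SQLTestUtils.scala in the Spark repository.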

0 Answers:

There are no answers yet.