Need to write JUnit tests for Scala code

Time: 2017-07-05 07:16:45

Tags: java scala junit

I am new to Scala and Spark. I need to write a JUnit test for the following word count program:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark._

object SparkWordCount {

  def main(args: Array[String]) {

    meth()

  }

  def meth() {

    //Spark Config object having cluster information
    val sparkConfig = new SparkConf()
      .setAppName("SparkWordCount")
      .setMaster("local")

    val sc = new SparkContext(sparkConfig)

    val input = sc.textFile("C:\\SparkW\\input\\inp.txt")
    val count = input.flatMap(line ⇒ line.split(" "))
      .map(word ⇒ (word, 1))
      .reduceByKey(_ + _)
    count.saveAsTextFile("outfile")
    System.out.println("OK");

  }

}

How do I write a test for this using JUnit?

2 answers:

Answer 0 (score: 0)

You can use one of the many Scala spec/testing libraries.

A test would look like:

import org.scalatest.{FlatSpec, Matchers}

class WordCountSpec extends FlatSpec with Matchers {

  "wordCount" should "return an empty output for an empty input" in
    withLocalSparkContext { context =>
      val lines = context.emptyRDD[String]
      val words = WordCount.wordCount(lines).collect()
      words should be (Seq.empty)
    }
}

Check this example for a complete sample Spec Class.

There are sample programs for scalatest specs and the like; have a search around.
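Note that the spec above assumes the counting logic has already been extracted into a `WordCount.wordCount` function and that a `withLocalSparkContext` helper exists; neither appears in the posted program. If you mainly want to test the transformation itself, one option (a sketch of my own, with a hypothetical `WordCountLogic` name, not part of the answer) is to extract the same pipeline over plain Scala collections, which needs no SparkContext at all:

```scala
object WordCountLogic {
  // The same flatMap / map / reduce-by-key pipeline as the Spark job,
  // but over a plain Seq so it can be unit-tested without a cluster.
  def countWords(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))
      .map(word => (word, 1))
      .groupBy(_._1)
      .map { case (word, pairs) => (word, pairs.map(_._2).sum) }
}
```

For example, `WordCountLogic.countWords(Seq("a b a"))` yields `Map("a" -> 2, "b" -> 1)`, and the Spark job's `flatMap`/`map`/`reduceByKey` chain can delegate to (or mirror) this function so both are exercised by the same tests.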

Answer 1 (score: 0)

In the meantime I also found another solution, so I wanted to share it:

import java.io.File

import org.junit.{Assert, Test}

/**
  * Created by usharani on 07-07-2017.
  */
class SparkWordCountTest {

  @Test
  def test(): Unit = {
    val line1 = new File("outfile/part-00000")
    val line2 = new File("C:/Users/usharani/IdeaProjects/Practice/src/test/scala/inputs.txt")
    val myString = scala.io.Source.fromFile(line1).getLines.mkString
    val myString1 = scala.io.Source.fromFile(line2).getLines.mkString
    println(myString)
    println(myString1)
    Assert.assertEquals(myString, myString1)
  }
}
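A caveat with this approach (my observation, not part of the answer): it compares the raw concatenated text of Spark's output file against an expected file, and `reduceByKey` gives no guarantee about line ordering, so the assertion can fail even when the counts are correct. A less fragile sketch is to parse Spark's default `(word,count)` output lines into a map before asserting (the `OutputParser` name is hypothetical):

```scala
object OutputParser {
  // Parses lines like "(word,3)" (Spark's default Tuple2.toString format)
  // into a Map, so the comparison is independent of line order.
  def parseCounts(lines: Seq[String]): Map[String, Int] =
    lines.map { line =>
      val inner = line.stripPrefix("(").stripSuffix(")")
      val idx = inner.lastIndexOf(',')
      (inner.substring(0, idx), inner.substring(idx + 1).trim.toInt)
    }.toMap
}
```

In the test, the lines read from `outfile/part-00000` would be passed through `parseCounts` and compared with `Assert.assertEquals` against a map of expected counts, instead of comparing raw strings.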