How to pass parameters to a method in Scala

Time: 2019-03-08 15:20:17

Tags: scala apache-spark dataframe apache-spark-sql

I am trying to pass a fourth parameter (targetFileCount) to the method below:

val config = ConfigFactory.load("market_opt_partition.properties")
val targetFileCount = (config.getInt(Code))
writeArray1.par.foreach {
  case (df, path, tog, targetFileCount) => Utility.write(df, path, tog, targetFileCount)
}
object Utility {
    def write(sourceDf: DataFrame, path: String, toggle: String, targetFileCount: Int): Unit
}

But I am getting the following error:

Error:(368, 12) constructor cannot be instantiated to expected type;
found   : (T1, T2, T3, T4)
required: (org.apache.spark.sql.DataFrame, String, String)
      case (df, path, tog, targetFileCount) => Utility.write(df, path, tog, targetFileCount)

Error:(368, 67) not found: value df
      case (df, path, tog, targetFileCount) => Utility.write(df, path, tog, targetFileCount)

Please let me know how to fix this.

2 Answers:

Answer 0 (score: 1)

writeArray1 contains tuples of three elements (org.apache.spark.sql.DataFrame, String, String), so it is not possible to pattern match on four parameters.

Another example:

val l = List(5)
l.map { case (a, b) => a.toString }

produces the same kind of error:

 error: constructor cannot be instantiated to expected type;
 found   : (T1, T2)
 required: Int
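For completeness, here is a sketch of how that small example can be fixed: either match the single Int directly, or make the list elements tuples so the pattern's arity matches the element type (illustrative only):

```scala
val l = List(5)
// The elements are plain Ints, so bind a single name rather than a tuple pattern.
l.map { case a => a.toString }               // List("5")

// Or, if tuples were intended, make the elements tuples so the arity matches:
val pairs = List((5, "five"))
pairs.map { case (a, b) => a.toString + b }  // List("5five")
```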

Answer 1 (score: 0)

As mentioned above, writeArray1.par contains tuples of three elements (org.apache.spark.sql.DataFrame, String, String), so you cannot pattern match on four parameters. Bind the three tuple elements in the pattern and pass targetFileCount from the enclosing scope.

Use the following instead:

val config = ConfigFactory.load("market_opt_partition.properties")
val targetFileCount = (config.getInt(Code))
writeArray1.par.foreach {
  case (df, path, tog) => Utility.write(df, path, tog, targetFileCount)
}
object Utility {
    def write(sourceDf: DataFrame, path: String, toggle: String, targetFileCount: Int): Unit
}
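Putting it together, a minimal Spark-free sketch of the same pattern, with a hypothetical DataFrame alias and a stubbed Utility.write standing in for the real ones, to show that targetFileCount is captured from the enclosing scope rather than destructured from the tuple:

```scala
// Hypothetical stand-in so the sketch compiles without Spark.
type DataFrame = String

object Utility {
  def write(sourceDf: DataFrame, path: String, toggle: String, targetFileCount: Int): Unit =
    println(s"writing $sourceDf to $path (toggle=$toggle, files=$targetFileCount)")
}

val targetFileCount = 10 // in the real code: config.getInt(...)
val writeArray1 = List(("df1", "/tmp/a", "on"), ("df2", "/tmp/b", "off"))

// Each tuple has three elements, so the pattern binds exactly three names;
// targetFileCount is a closure over the surrounding val.
writeArray1.par.foreach {
  case (df, path, tog) => Utility.write(df, path, tog, targetFileCount)
}
```

Note that on Scala 2.13+ the `.par` method requires the separate scala-parallel-collections module; on 2.12 and earlier it is part of the standard library.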