Spark UDF does not accept arrays

Posted: 2020-10-10 17:57:12

Tags: scala apache-spark

Why does a Spark UDF not support a Scala Array, while the same UDF works fine with WrappedArray? With Array[Date] in the function signature, calling the UDF throws a ClassCastException; once I change the parameter type to WrappedArray, it works fine.

import java.sql.Date
import java.time.temporal.ChronoUnit
import scala.collection.mutable

// Returns the date in listOfDate closest to packageSD + durationrange days.
def getDate(listOfDate: mutable.WrappedArray[Date], packageSD: Date, durationrange: Int): Date = {
  val nextdate = packageSD.toLocalDate.plusDays(durationrange)
  var billdate: Date = null
  var mindays = durationrange
  listOfDate.foreach { rec =>
    val daysDiff = Math.abs(ChronoUnit.DAYS.between(rec.toLocalDate, nextdate)).toInt
    if (daysDiff <= mindays) {
      mindays = daysDiff
      billdate = rec
    }
  }
  billdate
}

import org.apache.spark.sql.functions.udf
val udffn = udf(getDate _)
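For context, a sketch of how this registered UDF might be applied to a DataFrame. The column names (`billing_dates`, `package_start`, `duration`) are assumptions for illustration, not taken from the original post:

```scala
import org.apache.spark.sql.functions.col

// Hypothetical DataFrame df with an array<date>, a date, and an int column.
// The UDF receives the array column as a WrappedArray, matching the
// parameter type declared above.
val result = df.withColumn(
  "billdate",
  udffn(col("billing_dates"), col("package_start"), col("duration"))
)
result.show()
```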

1 Answer:

Answer 0 (score: 0)

Spark passes an array column into a UDF as a Seq. WrappedArray is a Seq, but Array is not a Seq, so a parameter declared as Array[Date] fails with a ClassCastException at runtime.
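This also means the most general fix is to declare the parameter as Seq, the supertype that covers whatever concrete collection Spark hands in. A minimal sketch of the contrast (the `firstDate` name is made up for illustration):

```scala
import java.sql.Date
import org.apache.spark.sql.functions.udf

// Declaring the parameter as Seq[Date] matches what Spark actually
// passes in (a WrappedArray, which is a Seq), so this UDF works:
val firstDate = udf((dates: Seq[Date]) => dates.headOption.orNull)

// Declaring it as Array[Date] compiles, but calling the UDF on an
// array column fails at runtime with a ClassCastException, because
// the WrappedArray Spark supplies cannot be cast to Array[Date]:
// val firstDateBroken = udf((dates: Array[Date]) => dates.headOption.orNull)
```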