Facing an issue while using a Spark UDF with multiple arguments

Asked: 2019-01-21 14:23:51

Tags: apache-spark-sql user-defined-functions

I am trying to encrypt data with SHA-256 by passing the algorithm name as a parameter to a Spark UDF, but I am getting an error. Please find the program snippet and the error details below.

Code snippet:

package com.sample
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import java.security.MessageDigest
import org.apache.spark.sql.functions._
import org.apache.spark.sql.expressions.UserDefinedFunction
import javax.xml.bind.DatatypeConverter;
import org.apache.spark.sql.Column

object Customer {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Customer-data").setMaster("local[2]").set("spark.executor.memory", "1g");

    val sc = new SparkContext(conf)

    val spark = SparkSession.builder().config(sc.getConf).getOrCreate()
    //val hash_algm=sc.getConf.get("halgm")
    val hash_algm="SHA-256"

    val df = spark.read.format("csv").option("header", "true").load("file:///home/tcs/Documents/KiranDocs/Data_files/sample_data")
    spark.udf.register("encriptedVal1", encriptedVal)
    //calling the encryption UDF
    //val resDF1 = df.withColumn(("ssn_number"), encriptedVal(df("customer_id"))).show()
    val resDF2 = df.withColumn(("ssn_number"), encriptedVal(array("customer_id", hash_algm))).show()


    println("data set"+resDF2)   


    sc.stop()

  }
   def encriptedVal = udf((s: String,s1:String) => {
    val digest = MessageDigest.getInstance(s1)
    val hash = digest.digest(s.getBytes("UTF-8"))
    DatatypeConverter.printHexBinary(hash)
  })

}

The error details are as follows:

2019-01-21 19:42:48 INFO SparkContext:54 - Invoking stop() from shutdown hook
Exception in thread "main" java.lang.ClassCastException: com.sample.Customer$$anonfun$encriptedVal$1 cannot be cast to scala.Function1
	at org.apache.spark.sql.catalyst.expressions.ScalaUDF.<init>(ScalaUDF.scala:104)
	at org.apache.spark.sql.expressions.UserDefinedFunction.apply(UserDefinedFunction.scala:85)
	at com.sample.Customer$.main(Customer.scala:26)
	at com.sample.Customer.main(Customer.scala)

1 Answer:

Answer 0 (score: 0)

The problem here is how you are calling the UDF you defined. Passing `array("customer_id", hash_algm)` hands the UDF a single array Column, but `encriptedVal` expects two separate arguments. You should call it like this:

val resDF1 = df.withColumn(("ssn_number"), encriptedVal(df.col("customer_id"), lit(hash_algm)))

because it accepts two Column objects (both Columns must be of String type, matching the parameter types defined in your UDF). `lit(hash_algm)` wraps the constant algorithm name in a Column so it can be passed alongside `df.col("customer_id")`.
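For reference, a minimal self-contained sketch of the corrected flow (assuming Spark is on the classpath; the in-memory sample data and the object name `CorrectedCall` are made up for illustration, while the UDF body and the column names come from the question):

```scala
import java.security.MessageDigest
import javax.xml.bind.DatatypeConverter
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{lit, udf}

object CorrectedCall {
  // Same logic as the question's UDF: hash the first argument with the
  // algorithm named by the second, and return the hex digest.
  val encriptedVal = udf((s: String, algm: String) => {
    val digest = MessageDigest.getInstance(algm)
    DatatypeConverter.printHexBinary(digest.digest(s.getBytes("UTF-8")))
  })

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[2]")
      .appName("udf-demo")
      .getOrCreate()
    import spark.implicits._

    // Stand-in for the CSV load in the question.
    val df = Seq("12345", "67890").toDF("customer_id")

    // Each UDF argument is its own Column; lit() lifts the constant
    // algorithm name into a Column expression.
    df.withColumn("ssn_number", encriptedVal($"customer_id", lit("SHA-256")))
      .show(false)

    spark.stop()
  }
}
```

Note that `withColumn(...).show()` returns Unit, so assigning it to a val (as `resDF2` in the question) and printing that val will not print the data; call `.show()` as the last step instead.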