I am trying to use the new to_json function in Spark 2.1 to convert a Map-type column to JSON. Here is the code snippet:
import org.apache.spark.sql.functions.to_json
val df = spark.sql("SELECT *, to_json(Properties) AS Prop, to_json(Measures) AS Meas FROM tempTable")
However, I get the following error. Any idea what might be wrong?
org.apache.spark.sql.AnalysisException: Undefined function: 'to_json'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 10
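For completeness, the temp table is built from a DataFrame that has two Map-type columns, roughly as in the minimal sketch below. The column names Properties and Measures match the query above; the values and the Id column are only illustrative, the real schema has more columns.

import spark.implicits._

// Illustrative data only: two Map columns registered as the temp table queried above.
val source = Seq(
  ("id-1", Map("colour" -> "red", "size" -> "L"), Map("weight" -> 1.5, "height" -> 2.0))
).toDF("Id", "Properties", "Measures")

source.createOrReplaceTempView("tempTable")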