Unsupported literal type class scala.runtime.BoxedUnit

Date: 2018-11-19 12:37:52

Tags: scala apache-spark-sql datastax databricks

I am trying to filter a column of a DataFrame read from Oracle, as shown below:

import org.apache.spark.sql.functions.{col, lit, when}

val df0  =  df_org.filter(col("fiscal_year").isNotNull())

When I do this, I get the following error:

java.lang.RuntimeException: Unsupported literal type class scala.runtime.BoxedUnit ()
at org.apache.spark.sql.catalyst.expressions.Literal$.apply(literals.scala:77)
at org.apache.spark.sql.catalyst.expressions.Literal$$anonfun$create$2.apply(literals.scala:163)
at org.apache.spark.sql.catalyst.expressions.Literal$$anonfun$create$2.apply(literals.scala:163)
at scala.util.Try.getOrElse(Try.scala:79)
at org.apache.spark.sql.catalyst.expressions.Literal$.create(literals.scala:162)
at org.apache.spark.sql.functions$.typedLit(functions.scala:113)
at org.apache.spark.sql.functions$.lit(functions.scala:96)
at org.apache.spark.sql.Column.apply(Column.scala:212)
at com.snp.processors.BenchmarkModelValsProcessor2.process(BenchmarkModelValsProcessor2.scala:80)
at com.snp.utils.Utils$$anonfun$getAllDefinedProcessors$1.apply(Utils.scala:30)
at com.snp.utils.Utils$$anonfun$getAllDefinedProcessors$1.apply(Utils.scala:30)
at com.sp.MigrationDriver$$anonfun$main$6$$anonfun$apply$2.apply(MigrationDriver.scala:140)
at com.sp.MigrationDriver$$anonfun$main$6$$anonfun$apply$2.apply(MigrationDriver.scala:140)
at scala.Option.map(Option.scala:146)
at com.sp.MigrationDriver$$anonfun$main$6.apply(MigrationDriver.scala:138)
at com.sp.MigrationDriver$$anonfun$main$6.apply(MigrationDriver.scala:135)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.MapLike$DefaultKeySet.foreach(MapLike.scala:174)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at com.sp.MigrationDriver$.main(MigrationDriver.scala:135)
at com.sp.MigrationDriver.main(MigrationDriver.scala)

Any idea what I am doing wrong here and how to fix it?

2 Answers:

Answer 0 (score: 2)

Simply remove the parentheses from the method call:

From:
val df0 = df_org.filter(col("fiscal_year").isNotNull())
To:
val df0 = df_org.filter(col("fiscal_year").isNotNull)

Answer 1 (score: 0)

Try removing the () from isNotNull() in your filter.
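
For completeness, the same null filter can also be written without the Column DSL. The following is a sketch of two equivalents using standard DataFrame APIs, reusing df_org from the question:

// SQL expression string instead of the Column DSL
val viaSql = df_org.filter("fiscal_year IS NOT NULL")

// DataFrameNaFunctions: drop rows that are null in the listed columns
val viaNaDrop = df_org.na.drop(Seq("fiscal_year"))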