spark dataframe null value count

Time: 2017-09-07 05:24:38

Tags: apache-spark null spark-dataframe

I'm new to Spark and I want to compute the null-value rate of each column (I have 200 columns). My function is as follows:

import org.apache.spark.sql.DataFrame

def nullCount(dataFrame: DataFrame): Unit = {
  val cols = dataFrame.columns
  val total = dataFrame.count()
  println("Null value rate of each column:")
  for (i <- cols.indices) {
    // -900 is the sentinel used for missing values in this data set
    val nullRate = dataFrame.rdd.filter(r => r(i) == -900).count().toDouble / total
    println(cols(i), nullRate)
  }
}

But I find this far too slow. Is there a more efficient way to do it?

1 Answer:

Answer 0 (score: 2)

Adapted from this answer by zero323. Since count only counts non-null values, the expression below gives the non-null fraction of each column in a single aggregation pass:

import org.apache.spark.sql.functions.{col, count, when}

df.select(df.columns.map(c => (count(c) / count("*")).alias(c)): _*)
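
If the null rate itself is wanted rather than the non-null fraction, subtracting from 1 should work; a minimal sketch, assuming the same df and the lit helper from the same functions package:

import org.apache.spark.sql.functions.{count, lit}

// 1 - (non-null count / total count) = fraction of nulls per column
df.select(df.columns.map(c => (lit(1) - count(c) / count("*")).alias(c)): _*)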

With -900 treated as the missing-value marker:

df.select(df.columns.map(
  c => (count(when(col(c) === -900, col(c))) / count("*")).alias(c)): _*)
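
A minimal end-to-end sketch of the -900 variant, assuming a local SparkSession; the object name NullRateExample, the column names a and b, and the sample values are made up for illustration:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, count, when}

object NullRateExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("null-rate").master("local[*]").getOrCreate()
    import spark.implicits._

    // Toy data where -900 stands in for a missing value
    val df = Seq((1, -900), (2, 3), (-900, -900)).toDF("a", "b")

    // One aggregation job over all columns instead of one RDD scan per column
    val rates = df.select(df.columns.map(
      c => (count(when(col(c) === -900, col(c))) / count("*")).alias(c)): _*)

    rates.show()  // expected: a -> 0.333..., b -> 0.666...

    spark.stop()
  }
}

Running all 200 columns through a single select like this replaces the 200 separate RDD scans in the original nullCount with one pass over the data.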