I'm new to Spark and I want to compute the null-value rate of each column (I have 200 columns). My function is as follows:
def nullCount(dataFrame: DataFrame): Unit = {
  val cols = dataFrame.columns
  val args = cols.length
  val d = dataFrame.count()
  println("Below are the null-value rates of each column:")
  for (i <- Range(0, args)) {
    // -900 is the sentinel that marks a missing value
    val nullrate = dataFrame.rdd.filter(r => r(i) == -900).count.toDouble / d
    println(cols(i), nullrate)
  }
}
But I found it far too slow. Is there a more efficient way to do this?
Answer 0 (score: 2)
Adapted from this answer by zero323:
import org.apache.spark.sql.functions.{col, count, when}

// count(c) counts only non-null values, so this yields the NON-null rate
// per column; subtract from 1.0 to get the null rate
df.select(df.columns.map(c => (count(c) / count("*")).alias(c)): _*)
With -900 as the missing-value sentinel:
df.select(df.columns.map(
  c => (count(when(col(c) === -900, col(c))) / count("*")).alias(c)): _*)
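To make the difference from the question's loop concrete, here is a minimal self-contained sketch that wraps the one-liner above into a function returning the per-column -900 rate in a single pass over the data (instead of one RDD scan per column). It assumes Spark is on the classpath and uses a hypothetical helper name `nullRates` and a local SparkSession purely for illustration; the -900 sentinel comes from the question.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.{col, count, when}

object NullRate {
  // One aggregation job computes the -900 rate for every column at once;
  // the result is a single Row with one Double per column.
  def nullRates(df: DataFrame): Map[String, Double] = {
    val row = df.select(df.columns.map(
      c => (count(when(col(c) === -900, col(c))) / count("*")).alias(c)): _*
    ).head()
    df.columns.zipWithIndex
      .map { case (c, i) => c -> row.getDouble(i) }
      .toMap
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("nullRateDemo")
      .getOrCreate()
    import spark.implicits._

    // Tiny demo frame: column "a" has one -900 out of 3 rows, "b" has two
    val df = Seq((1, -900), (-900, -900), (3, 4)).toDF("a", "b")
    NullRate.nullRates(df).foreach { case (c, r) => println(s"$c -> $r") }

    spark.stop()
  }
}
```

Because the whole computation is expressed as one `select`, Spark evaluates all 200 aggregates in a single scan, which is where the speedup over the per-column `rdd.filter(...).count` loop comes from.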