Get data from a CSV file and calculate an average

Posted: 2019-09-18 11:16:28

Tags: apache-spark pyspark pyspark-sql

I need to calculate the average of one column from a CSV file using Python Spark.

I have this code:

    from pyspark.sql import SparkSession

    scSpark = SparkSession \
        .builder \
        .appName("Python Spark SQL basic example: Reading CSV file without mentioning schema") \
        .config("spark.some.config.option", "some-value") \
        .getOrCreate()

    sdfData = scSpark.read.csv("document.csv", header=True, sep=",")
    sdfData.show()

This prints the following data to the screen:

   +---------+------+---------+------------------+
   |     Name| total| test val|             ratio|
   +---------+------+---------+------------------+
   |parimatch|     3|   test7 |0.6164045285312666|
   |parimatch|     4|   test6 |0.5829715240832467|
   |     leon|     3|   test5 |0.6164045285312666|
   |     leon|     4|   test4 |0.5829715240832467|
   |parimatch|     3|   test3 |0.6164045285312666|
   |parimatch|     4|    test |0.5829715240832467|
   +---------+------+---------+------------------+

How can I calculate the average ratio with Spark?

1 Answer:

Answer 0 (score: 0)

Apache Spark has an avg function that does exactly this:

    import pyspark.sql.functions as f

    average = sdfData.agg(f.avg(f.col("ratio")))
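
This returns a one-row DataFrame rather than a number. A minimal sketch of pulling out the value itself, assuming the CSV was read without schema inference so the ratio column is still a string and is cast to double first (the alias name avg_ratio is just illustrative):

    import pyspark.sql.functions as f

    # cast ratio to double before averaging, since read.csv with only
    # header=True leaves every column typed as string
    avg_row = sdfData.agg(
        f.avg(f.col("ratio").cast("double")).alias("avg_ratio")
    ).first()

    print(avg_row["avg_ratio"])  # prints the average of the ratio column

Alternatively, calling average.show() on the aggregated DataFrame just displays the result on screen instead of returning it to Python.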