Converting a pyspark dataframe column with the round function does not work (pyspark)

Posted: 2019-07-09 15:44:58

Tags: pyspark rounding

I want to create a new column in a Spark DataFrame containing the rounded values of an existing column. The "em" column is of type float.

I have checked various posts, including the following link, but could not figure it out: Trouble With Pyspark Round Function

Here is my code:

import pyspark.sql.functions as f
df = df.withColumn("rounded", f.round(f.col("em"), 3))
df.show()

The newly created column "rounded" is exactly identical to the original column "em". I am using PySpark 2.3.0 in a Zeppelin notebook on a Cloudera cluster.

Update:

I tried the following:

%pyspark
import pyspark.sql.functions as f

s2_em = s2.select('em')
print("Datatype:", type(s2_em))
s2_em.printSchema()
s2_em = s2_em.withColumn('rounded', f.round(f.col('em'), 3))
s2_em = s2_em.withColumn('plus', f.col('em') + f.col('rounded'))
s2_em = s2_em.withColumn('minus', f.col('em') - f.col('rounded'))
s2_em = s2_em.withColumn('multiplication', f.col('em') * f.col('rounded'))
s2_em.limit(5).show()

This produces the following output, but the rounding still has no effect. Any other hints?

Datatype: <class 'pyspark.sql.dataframe.DataFrame'>
root
 |-- em: float (nullable = true)
+------------+------------+------------+-----+--------------+
|          em|     rounded|        plus|minus|multiplication|
+------------+------------+------------+-----+--------------+
|1.14209626E9|1.14209626E9|2.28419251E9| 0.0|   1.3043839E18|
|1.25046528E9|1.25046528E9|2.50093056E9| 0.0|  1.56366345E18|
| 9.5720672E8| 9.5720672E8|1.91441344E9| 0.0|   9.1624469E17|
| 1.1392649E9| 1.1392649E9|2.27852979E9| 0.0|  1.29792455E18|
|1.29539699E9|1.29539699E9|2.59079398E9| 0.0|  1.67805334E18|
+------------+------------+------------+-----+--------------+

1 Answer:

Answer 0 (score: 1)

Testing with the same code, it works perfectly; see the example below:

import pyspark.sql.functions as f
from pyspark.sql import Row
from pyspark.shell import spark

df = spark.createDataFrame([
    Row(em=3.45631),
    Row(em=2.82945),
    Row(em=7.76261),
    Row(em=2.76790)
])

df = df.withColumn('rounded', f.round(f.col('em'), 3))
df.show()

Output:

+-------+-------+                                                               
|     em|rounded|
+-------+-------+
|3.45631|  3.456|
|2.82945|  2.829|
|7.76261|  7.763|
| 2.7679|  2.768|
+-------+-------+

Update

As it turns out, your float values carry exponents such as E9 and E8; the value 1.14209626E9, for example, equals 1142096260. At that magnitude a 32-bit float cannot represent any fractional digits at all (its resolution is 128), so rounding to 3 decimal places returns the value unchanged.
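A minimal plain-Python sketch of this effect (simulating Spark's FloatType by round-tripping a value through a 32-bit float with the struct module):

import struct

def as_float32(x):
    # round-trip through a 32-bit float, mimicking FloatType precision
    return struct.unpack('f', struct.pack('f', x))[0]

x = as_float32(1.14209626E9)
print(x)                 # 1142096256.0 -- already a whole number
print(round(x, 3) == x)  # True: nothing left to round at 3 decimal places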

To round them at this scale, one workaround is to divide the values by 1e9, call the round function, and then multiply by 1e9 again.

See the following example:

import pyspark.sql.functions as f
from pyspark.sql import Row
from pyspark.shell import spark

df = spark.createDataFrame([
    Row(em=1.14209626E9),
    Row(em=1.25046528E9),
    Row(em=9.5720672E8)
])

df = df.withColumn('rounded', (f.round(f.col('em') / 1e9, 3)) * 1e9)
df.show()

Output:

+------------+-------+
|          em|rounded|
+------------+-------+
|1.14209626E9|1.142E9|
|1.25046528E9| 1.25E9|
| 9.5720672E8| 9.57E8|
+------------+-------+
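
Alternatively (a sketch, assuming the goal is simply to round at the millions digit), pyspark.sql.functions.round also accepts a negative scale, which rounds the integral part and avoids the divide/multiply round trip:

import pyspark.sql.functions as f
from pyspark.sql import Row
from pyspark.shell import spark

df = spark.createDataFrame([
    Row(em=1.14209626E9),
    Row(em=1.25046528E9),
    Row(em=9.5720672E8)
])

# scale -6 rounds at the millions digit, matching the divide/round/multiply above
df = df.withColumn('rounded', f.round(f.col('em'), -6))
df.show()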