I have a PySpark script:
from pyspark.sql import SparkSession
import pyspark.sql.functions as psf
import json

spark = SparkSession \
    .builder \
    .appName("Hello") \
    .getOrCreate()
sc = spark.sparkContext

# Parse one JSON object per line into a DataFrame
ratings = spark.createDataFrame(
    sc.textFile("transactions.json").map(lambda l: json.loads(l))
)
ratings.createOrReplaceTempView("ratings")
final_df = spark.sql("select * from ratings")
final_df.show(20, False)
This produces the output:
+--------+-------------------+------------+----------+-------------+-------+
|click_id| created_at| ip|product_id|product_price|user_id|
+--------+-------------------+------------+----------+-------------+-------+
| 123|2016-10-03 12:50:33| 10.10.10.10| 98373| 220.5| 1|
| 124|2017-02-03 11:51:33| 10.13.10.10| 97373| 320.5| 1|
| 125|2017-10-03 12:52:33| 192.168.2.1| 96373| 20.5| 1|
| 126|2017-10-03 13:50:33|172.16.11.10| 88373| 220.5| 2|
| 127|2017-10-03 13:51:33| 10.12.15.15| 87373| 320.5| 2|
| 128|2017-10-03 13:52:33|192.168.1.10| 86373| 20.5| 2|
| 129|2017-08-03 14:50:33| 10.13.10.10| 78373| 220.5| 3|
| 130|2017-10-03 14:51:33| 12.168.1.60| 77373| 320.5| 3|
| 131|2017-10-03 14:52:33| 10.10.30.30| 76373| 20.5| 3|
+--------+-------------------+------------+----------+-------------+-------+
I want to store only the values of the ip column in a list, so that I can pass each value as an argument to a function named def find_ip_city(ip_address), which returns the city for that IP.
Does Spark have a way to store a DataFrame column in a list and pass each value of the list as an argument?
Answer 0 (score: 0)
I think you can use your function find_ip_city(ip_address) as a UDF and pass the column values to it directly, instead of collecting them into a list. Here is pseudo-code:
def find_ip_city(ip_address):
    pass

from pyspark.sql.functions import udf
from pyspark.sql.types import StringType  # needed for the UDF return type

cityUDF = udf(find_ip_city, StringType())
final_df.withColumn('city', cityUDF(final_df['ip'])).show()
Hope this helps!