I'm new to PySpark. I wrote this code in PySpark:

def filterOut2(line):
    return [x for x in line if x != 2]

filtered_lists = data.map(filterOut2)
But I got this error:

'list' object has no attribute 'map'

How do I perform a map operation on my data specifically in PySpark, in a way that lets me filter the data down to the values for which my condition evaluates to true?
Answer 0 (score: 2)
map(filterOut2, data) works:
>>> data = [[1,2,3,5],[1,2,5,2],[3,5,2,8],[6,3,1,2],[5,3,2,5],[4,1,2,5]]
>>> def filterOut2(line):
...     return [x for x in line if x != 2]
...
>>> list(map(filterOut2, data))
[[1, 3, 5], [1, 5], [3, 5, 8], [6, 3, 1], [5, 3, 5], [4, 1, 5]]
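
If you want to run this in Spark itself rather than on a plain Python list, the same function can be passed to RDD.map once the list has been parallelized. A minimal sketch, assuming a local SparkContext named sc (not part of the original question):

from pyspark import SparkContext

sc = SparkContext("local", "filter-example")   # assumed local Spark setup
rdd = sc.parallelize(data)                     # turn the nested lists into an RDD
filtered_lists = rdd.map(filterOut2)           # .map exists on RDDs, not on Python lists
print(filtered_lists.collect())                # [[1, 3, 5], [1, 5], [3, 5, 8], [6, 3, 1], [5, 3, 5], [4, 1, 5]]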
If you instead get

map() takes exactly 1 argument (2 given)

then you appear to have redefined map. Try __builtin__.map(filterOut2, data) instead.
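
For illustration, here is one way the builtin can end up shadowed and what __builtin__.map does about it. This is a hypothetical reproduction under Python 2 (hence __builtin__; on Python 3 the module is named builtins), not necessarily what happened in the original session:

>>> def map(func):                      # hypothetical one-argument function shadowing the builtin
...     return func
...
>>> map(filterOut2, data)               # the shadowing function is called instead of the builtin
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: map() takes exactly 1 argument (2 given)
>>> import __builtin__
>>> __builtin__.map(filterOut2, data)   # bypasses the shadowing name and reaches the real builtin
[[1, 3, 5], [1, 5], [3, 5, 8], [6, 3, 1], [5, 3, 5], [4, 1, 5]]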
Alternatively, use a list comprehension:
>>> [filterOut2(line) for line in data]
[[1, 3, 5], [1, 5], [3, 5, 8], [6, 3, 1], [5, 3, 5], [4, 1, 5]]
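
The same comprehension can also be used directly as the mapped function on an RDD, which keeps the filtering logic inline. A sketch, assuming the rdd built from data in the earlier Spark example:

>>> rdd.map(lambda line: [x for x in line if x != 2]).collect()
[[1, 3, 5], [1, 5], [3, 5, 8], [6, 3, 1], [5, 3, 5], [4, 1, 5]]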