Iterate over a Python list and sum equal items

Date: 2016-07-15 22:06:32

Tags: python json

Consider the endpoint in my backend, which returns the following response:

class Arc_Edges_Data(Resource):
    def get(self):
        # Connect to database
        conn = connectDB()
        cur = conn.cursor()
        # Perform query and return JSON data
        try:
            cur.execute("select json_build_object('source', start_location, 'target', end_location, 'frequency', 1) from trips")
        except:
            print("Error executing select")
        ArcList = list(i[0] for i in cur.fetchall())
        return ArcList

The frequency for each trip should always be 1, so this ArcList produces a response like this:

[
    {
        "frequency": 1,
        "source": "c",
        "target": "c"
    },
    {
        "frequency": 1,
        "source": "a",
        "target": "b"
    },
    {
        "frequency": 1,
        "source": "a",
        "target": "b"
    },
    ...
]

How can I iterate over this response and sum the items that have the same source and target? In this case the resulting list would contain only one source/target pair, "a" and "b", but with a frequency of 2 because of the sum.

I know that in JavaScript I could use something like Array.reduce, but I don't think it exists in Python.
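(For reference, Python does offer functools.reduce; a minimal sketch of the same grouping done that way, assuming ArcList holds the parsed response shown above:)

from functools import reduce

def accumulate(acc, item):
    # Fold each item into a dict keyed by its (source, target) pair
    key = (item['source'], item['target'])
    acc[key] = acc.get(key, 0) + item['frequency']
    return acc

totals = reduce(accumulate, ArcList, {})
# totals -> {('c', 'c'): 1, ('a', 'b'): 2}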

2 Answers:

Answer 0 (score: 2)

How about this?

import collections

data = [
    {
        "frequency": 1, 
        "source": "c", 
        "target": "c",
    }, 
    {
        "frequency": 1, 
        "source": "a", 
        "target": "b",
    },
    {
        "frequency": 1, 
        "source": "a", 
        "target": "b",
    },
]

counter = collections.Counter()

# Key each entry by its (source, target) pair and sum up the frequencies
for datum in data:
    counter[(datum['source'], datum['target'])] += datum['frequency']

print(counter)

# Output:
# Counter({('a', 'b'): 2, ('c', 'c'): 1})

Oh, and if you want to put the data back into the same format again, add this code:

newdata = [{
    'source': k[0],
    'target': k[1],
    'frequency': v,
} for k, v in counter.items()]

print(newdata)

# Output:
# [{'frequency': 1, 'target': 'c', 'source': 'c'}, {'frequency': 2, 'target': 'b', 'source': 'a'}]
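
To tie this back to the endpoint in the question, the same aggregation can be wrapped in a small helper and applied to ArcList before it is returned; a minimal sketch, where aggregate_arcs is a hypothetical helper name:

import collections

def aggregate_arcs(arc_list):
    # Sum frequencies for entries sharing the same (source, target) pair
    counter = collections.Counter()
    for arc in arc_list:
        counter[(arc['source'], arc['target'])] += arc['frequency']
    return [
        {'source': source, 'target': target, 'frequency': freq}
        for (source, target), freq in counter.items()
    ]

# e.g. inside Arc_Edges_Data.get():
#     return aggregate_arcs(ArcList)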

Answer 1 (score: 0)

You could do it like this:

r = {}
# Accumulate frequencies keyed by the (source, target) pair
for d in ArcList:
    key = (d['source'], d['target'])
    r[key] = r.setdefault(key, 0) + d['frequency']
return [{'source': k[0], 'target': k[1], 'frequency': v} for k, v in r.items()]

If you want to preserve the original ordering of the items:

from collections import OrderedDict
r = OrderedDict()
# The rest of the solution is the same
...
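
For completeness, a minimal sketch of that ordered variant, again assuming ArcList holds the parsed response:

from collections import OrderedDict

r = OrderedDict()
for d in ArcList:
    key = (d['source'], d['target'])
    r[key] = r.setdefault(key, 0) + d['frequency']
# Pairs come out in first-seen order because OrderedDict preserves insertion order
result = [{'source': k[0], 'target': k[1], 'frequency': v} for k, v in r.items()]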