psycopg2 - inserting a list of dictionaries into a PostgreSQL database. Too many executes?

Time: 2019-03-01 14:32:19

Tags: python postgresql psycopg2

I'm inserting a list of dictionaries into a PostgreSQL database. The list will grow quickly, and the number of dictionary values (columns) is around 30. Sample data:

projects = [
    {'name': 'project alpha', 'code': 12, 'active': True},
    {'name': 'project beta', 'code': 25, 'active': True},
    {'name': 'project charlie', 'code': 46, 'active': False}
]

Inserting the data into the PostgreSQL database with the code below does work (as in this answer), but I'm worried that it executes too many queries.

from psycopg2.extensions import AsIs

for project in projects:
    columns = project.keys()
    values = project.values()

    # One INSERT per dictionary: the column list is spliced in via AsIs,
    # the row values are passed as a single tuple parameter.
    query = """INSERT INTO projects (%s) VALUES %s;"""

    # print(cursor.mogrify(query, (AsIs(','.join(project.keys())), tuple(project.values()))))

    cursor.execute(query, (AsIs(','.join(columns)), tuple(values)))

conn.commit()

Is there a better way to do this? Thanks a lot in advance for your help!

3 Answers:

Answer 0 (score: 1)

Answer 1 (score: 1)

Use execute_values() to insert hundreds of rows in a single query.

import psycopg2
from psycopg2.extras import execute_values

# ...

projects = [
    {'name': 'project alpha', 'code': 12, 'active': True},
    {'name': 'project beta', 'code': 25, 'active': True},
    {'name': 'project charlie', 'code': 46, 'active': False}
]

columns = projects[0].keys()
query = "INSERT INTO projects ({}) VALUES %s".format(','.join(columns))

# convert the list of dicts into a sequence of sequences (one row per project)
values = [list(project.values()) for project in projects]

execute_values(cursor, query, values)
conn.commit()
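
If you'd rather skip the conversion to a sequence of sequences, execute_values() also accepts a template of named placeholders, so the list of dictionaries can be passed as-is. A minimal sketch, assuming the same projects list and an already open cursor (the column names are taken from the sample data, adjust to your table):

from psycopg2.extras import execute_values

# The template is merged once per dict; named placeholders pull values by key.
query = "INSERT INTO projects (name, code, active) VALUES %s"
template = "(%(name)s, %(code)s, %(active)s)"

execute_values(cursor, query, projects, template=template)
conn.commit()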

Answer 2 (score: 0)

Another performant option that requires no extra data munging for your list of dictionaries is execute_batch() (new in psycopg2 version 2.7).

For example:

import psycopg2
from psycopg2.extras import execute_batch

projects = [{'name': 'project alpha', 'code': 12, 'active': True}, ...]
query = "INSERT INTO projects VALUES (%(name)s, %(code)s, %(active)s)"

execute_batch(cursor, query, values)
conn.commit()

https://www.psycopg.org/docs/extras.html#psycopg2.extras.execute_batch
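
For completeness, a minimal end-to-end sketch of the execute_batch() approach; the connection string is a placeholder, and page_size (default 100) controls how many statements are sent per network round trip:

import psycopg2
from psycopg2.extras import execute_batch

projects = [
    {'name': 'project alpha', 'code': 12, 'active': True},
    {'name': 'project beta', 'code': 25, 'active': True},
    {'name': 'project charlie', 'code': 46, 'active': False},
]

query = "INSERT INTO projects (name, code, active) VALUES (%(name)s, %(code)s, %(active)s)"

# Placeholder connection string -- replace with your own credentials.
with psycopg2.connect("dbname=test user=postgres") as conn:
    with conn.cursor() as cursor:
        # Sends the inserts in batches of page_size statements per round trip.
        execute_batch(cursor, query, projects, page_size=100)
# The connection context manager commits the transaction on a clean exit.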