How do I bulk insert data from a list of dictionaries into a PostgreSQL database?

Time: 2018-07-20 15:55:11

Tags: python list dictionary

For example:

books = [{'name': 'pearson', 'price': 60, 'author': 'Jesse Pinkman'},
         {'name': 'ah publications', 'price': 80, 'author': 'Gus Fring'},
         {'name': 'euclidean', 'price': 120, 'author': 'Skyler White'},
         {'name': 'Nanjial', 'price': 260, 'author': 'Saul Goodman'}]

I need to insert each dictionary into an already created table, taking only the 'author' and 'price' keys, and I want to insert 100,000 records. Right now I loop over the list of dictionaries, pick out the key/value pairs I need, and insert them one at a time:

def insert_books(self, val):
    cur = self.con.cursor()
    sql = """insert into testtable values {}""".format(val)
    cur.execute(sql)
    self.con.commit()
    cur.close()

for i in books:
    result = i['author'], i['price']
    db_g.insert_books(result)   # db_g is an instance of the class holding the connection properties

So is there a faster and easier way to bulk insert the data, say 10k records at a time?

1 Answer:

Answer 0 (score: 0)

I think bulk inserting by dumping the whole DataFrame will be faster. See Why Bulk Import is faster than bunch of INSERTs?

import pandas as pd
import sqlalchemy

def db_conn():
    # Replace the placeholder below with your actual connection string
    connection = sqlalchemy.create_engine('postgresql://user:password@host:5432/dbname')
    return connection


books = [{'name': 'pearson', 'price': 60, 'author': 'Jesse Pinkman'},
         {'name': 'ah publications', 'price': 80, 'author': 'Gus Fring'},
         {'name': 'euclidean', 'price': 120, 'author': 'Skyler White'},
         {'name': 'Nanjial', 'price': 260, 'author': 'Saul Goodman'}]

df_to_ingest = pd.DataFrame(books)
df_to_ingest = df_to_ingest[['author', 'price']]

df_to_ingest.to_sql('tablename', db_conn(), if_exists='append', index=False)
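
For a load the size mentioned in the question (100k rows), to_sql can also write in batches via its chunksize parameter instead of sending everything in one call. A minimal sketch, reusing the same placeholder tablename and the db_conn() engine from above:

import pandas as pd

# Same two-column DataFrame as above
df_to_ingest = pd.DataFrame(books)[['author', 'price']]

# chunksize controls how many rows pandas writes per batch
df_to_ingest.to_sql('tablename', db_conn(), if_exists='append',
                    index=False, chunksize=10000)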

Hope this helps.
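
If you would rather keep the raw psycopg2 cursor from the question instead of going through pandas, psycopg2 also ships a batched helper, execute_values, which folds many rows into a single multi-row INSERT. A minimal sketch, assuming con is the same psycopg2 connection the question's class holds (self.con) and testtable has the same two-column layout as in the question:

import psycopg2.extras

# con: the existing psycopg2 connection (self.con in the question)
rows = [(b['author'], b['price']) for b in books]
cur = con.cursor()
# execute_values expands the single %s into one multi-row VALUES list
psycopg2.extras.execute_values(cur, "insert into testtable values %s", rows)
con.commit()
cur.close()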