Inserting billions of rows into SQLite with Python

Time: 2019-01-28 08:07:22

Tags: python sqlite

I want to insert billions of values (exchange rates) into an SQLite db file. I want to use threads because it takes a long time, but the thread pool loop keeps executing the same nth element multiple times. I have a print statement at the beginning of the method, and it prints multiple times instead of once.

from concurrent.futures import ThreadPoolExecutor

# x, mydata and conn (an sqlite3 connection) are defined elsewhere
pool = ThreadPoolExecutor(max_workers=2500)

def gen_nums(i, cur):
    global x
    print('row number', x, ' has started')
    gen_numbers = list(mydata)
    sql_data = []
    for f in gen_numbers:
        sql_data.append((f, i, mydata[i]))
    cur.executemany('INSERT INTO numbers (rate, min, max) VALUES (?, ?, ?)', sql_data)
    print('row number', x, ' has finished')
    x += 1


with conn:
    cur = conn.cursor()
    for i in mydata:
        pool.submit(gen_nums, i, cur)

pool.shutdown(wait=True)

The output is:

row number 1  has started
row number 1  has started
row number 1  has started
row number 1  has started
row number 1  has started
row number 1  has started
row number 1  has started
...

1 Answer:

Answer 0 (score: 1)

Use a generator expression to split the data into chunks on the fly, and do the inserts inside a transaction.

Here is how your code could look.
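A minimal sketch of that approach (not the answerer's original snippet): it assumes mydata and conn exist as in the question, drops the thread pool, since a default sqlite3 connection must not be shared across threads and a single writer inside one transaction is already fast, and streams the rows through a generator in fixed-size chunks.

from itertools import islice

def chunked(iterable, size=100_000):
    # Yield successive lists of at most `size` items from the iterable.
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

def rows():
    # One (rate, min, max) tuple per pair, mirroring the loops in the question.
    for i in mydata:
        for f in mydata:
            yield (f, i, mydata[i])

with conn:  # one transaction around all the inserts
    cur = conn.cursor()
    for chunk in chunked(rows()):
        cur.executemany('INSERT INTO numbers (rate, min, max) VALUES (?, ?, ?)', chunk)

Chunking keeps memory bounded, while the single with conn: block still means one BEGIN/COMMIT pair for everything.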

Also, sqlite can import CSV files.
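For example (a hypothetical sketch, with file and table names chosen here), the rates could be written out with Python's csv module and then loaded with the sqlite3 shell's .mode csv and .import commands instead of being inserted row by row:

import csv

# rows() is the generator sketched above
with open('rates.csv', 'w', newline='') as f:
    csv.writer(f).writerows(rows())

# Then, in the sqlite3 shell:
#   sqlite3 rates.db
#   sqlite> .mode csv
#   sqlite> .import rates.csv numbers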

SQLite can do tens of thousands of inserts per second; just make sure they all happen in a single transaction by surrounding the inserts with BEGIN and COMMIT. (executemany() does this automatically.)
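For illustration, the same point written out explicitly with Python's sqlite3 module (a sketch with placeholder data and file name; isolation_level=None puts the connection in autocommit mode, so the transaction boundaries are exactly the BEGIN and COMMIT shown):

import sqlite3

conn = sqlite3.connect('rates.db', isolation_level=None)  # autocommit: we issue BEGIN/COMMIT ourselves
conn.execute('CREATE TABLE IF NOT EXISTS numbers (rate REAL, min REAL, max REAL)')

data = ((r / 1000.0, 0.0, 1.0) for r in range(100_000))  # placeholder rows
conn.execute('BEGIN')
conn.executemany('INSERT INTO numbers (rate, min, max) VALUES (?, ?, ?)', data)
conn.execute('COMMIT')
conn.close()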

As always, don't optimize before you know that speed will be a problem. Test the simplest solution first, and only optimize if the speed turns out to be unacceptable.