Speeding up multithreaded loading of data from Python into SQL Server

Time: 2019-06-11 13:01:44

Tags: python multithreading

I am trying to load dataframes from Python into SQL Server over ODBC. The tables have no more than 100 records each, but as soon as I use threading my kernel dies.

Code:

import threading

import pyodbc
from pymongo import MongoClient

# conn and cur are assumed to be a single module-level pyodbc connection and
# cursor that both worker functions share.

def tbl1(dataframe):
    # Insert every row of the dataframe into tablename1, committing after each row.
    for index, row in dataframe.iterrows():
        cur.execute("INSERT INTO database.dbo.tablename1([x1], [x2]) VALUES (?, ?)", row['x1'], row['x2'])
        conn.commit()

def tbl2(dataframe):
    # Insert every row of the dataframe into tablename2, committing after each row.
    for index, row in dataframe.iterrows():
        cur.execute("INSERT INTO database.dbo.tablename2([y1], [y2]) VALUES (?, ?)", row['y1'], row['y2'])
        conn.commit()

if __name__ == "__main__":
    client = MongoClient('x.x.x.x', y)
    db = client.database
    t1 = threading.Thread(target=tbl1, args=('xx',))
    t2 = threading.Thread(target=tbl2, args=('yy',))
    # start both threads
    t1.start()
    t2.start()
    # wait until both threads have finished
    t1.join()
    t2.join()
    # both threads completely executed
    print("Done!")

As soon as I run main, my kernel dies. Is there a solution?
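For comparison, here is a minimal sketch of the same two inserts done with one pyodbc connection per thread and a batched executemany, since sharing a single connection and cursor across threads is a common cause of crashes. The connection string, the sample dataframes, and the helper name insert_rows are placeholders for illustration, not part of the original code:

import threading

import pandas as pd
import pyodbc

# Placeholder connection string; the real driver, server, and credentials are
# not shown in the question.
CONN_STR = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=x.x.x.x;DATABASE=database;UID=user;PWD=password"

def insert_rows(dataframe, insert_sql, columns):
    # Each thread opens its own connection: pyodbc connections and cursors
    # should not be shared between threads.
    conn = pyodbc.connect(CONN_STR)
    cur = conn.cursor()
    cur.fast_executemany = True  # send parameter sets in batches
    cur.executemany(insert_sql, dataframe[columns].values.tolist())
    conn.commit()
    conn.close()

if __name__ == "__main__":
    # Stand-ins for the real dataframes.
    df1 = pd.DataFrame({'x1': [1, 2], 'x2': [3, 4]})
    df2 = pd.DataFrame({'y1': [5, 6], 'y2': [7, 8]})

    t1 = threading.Thread(
        target=insert_rows,
        args=(df1, "INSERT INTO database.dbo.tablename1([x1], [x2]) VALUES (?, ?)", ['x1', 'x2']))
    t2 = threading.Thread(
        target=insert_rows,
        args=(df2, "INSERT INTO database.dbo.tablename2([y1], [y2]) VALUES (?, ?)", ['y1', 'y2']))
    t1.start()
    t2.start()
    t1.join()
    t2.join()
    print("Done!")

Using executemany with fast_executemany also batches the parameter sets instead of issuing one INSERT and one commit per row.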

0 Answers:

There are no answers yet.