I'm new to multithreading in Python and am currently writing a script that appends to a csv file. If I submit several threads to a concurrent.futures.ThreadPoolExecutor, each of which appends rows to the csv file, what can I do to guarantee thread safety if appending is the only file-related operation these threads perform?
A simplified version of my code:
with concurrent.futures.ThreadPoolExecutor(max_workers=3) as executor:
    for count, ad_id in enumerate(advertisers):
        downloadFutures.append(executor.submit(downloadThread, arguments.....))
        time.sleep(random.randint(1, 3))
The function each thread runs is:
def downloadThread(arguments......):
    # Some code.....
    writer.writerow(re.split(',', line.decode()))
Should I set up a separate single-threaded executor to handle the writing, or is it not a concern if all I am doing is appending?
EDIT: I should elaborate that the time until the next append to the file can vary greatly. I am just concerned that this situation has not come up while testing my script, and I would prefer to be covered against it.
Answer 0 (score: 11)
I'm not sure whether csvwriter is thread-safe. The documentation doesn't specify, so to be safe, if multiple threads use the same object, you should protect access with a threading.Lock:
# create the lock
import threading
csv_writer_lock = threading.Lock()

def downloadThread(arguments......):
    # pass csv_writer_lock somehow
    # Note: use csv_writer_lock on *any* access
    # Some code.....

    with csv_writer_lock:
        writer.writerow(re.split(',', line.decode()))
That being said, it may indeed be more elegant to have downloadThread submit write jobs to an executor instead of explicitly using a lock like this.
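A minimal sketch of that idea, assuming the rows are simple lists of strings; the file handling and the names writer_executor and row are illustrative, not from the original code:

import concurrent.futures
import csv

f = open('some.csv', 'a', newline='')
writer = csv.writer(f)

# A single worker serializes all writes: no two writerow calls can ever
# run concurrently, so no explicit lock is needed.
writer_executor = concurrent.futures.ThreadPoolExecutor(max_workers=1)

def downloadThread(ad_id):
    # ... the download work happens here, concurrently ...
    row = [ad_id, 'some', 'data']
    # Hand the row to the dedicated writer thread instead of writing here:
    writer_executor.submit(writer.writerow, row)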
Answer 1 (score: 2)
Here is some code that also handles the headache-inducing unicode issues:
import csv
import threading

def ensure_bytes(s):
    return s.encode('utf-8') if isinstance(s, unicode) else s

class ThreadSafeWriter(object):
    '''
    >>> from StringIO import StringIO
    >>> f = StringIO()
    >>> wtr = ThreadSafeWriter(f)
    >>> wtr.writerow(['a', 'b'])
    >>> f.getvalue() == "a,b\\r\\n"
    True
    '''
    def __init__(self, *args, **kwargs):
        self._writer = csv.writer(*args, **kwargs)
        self._lock = threading.Lock()

    def _encode(self, row):
        return [ensure_bytes(cell) for cell in row]

    def writerow(self, row):
        row = self._encode(row)
        with self._lock:
            return self._writer.writerow(row)

    def writerows(self, rows):
        rows = (self._encode(row) for row in rows)
        with self._lock:
            return self._writer.writerows(rows)

# example:
with open('some.csv', 'w') as f:
    writer = ThreadSafeWriter(f)
    writer.writerow([u'中文', 'bar'])
A more detailed solution is here.
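Note that the snippet above is Python 2 code (unicode, StringIO). On Python 3, csv.writer accepts str directly, so the byte-encoding step can be dropped; a minimal sketch of the same locking wrapper:

import csv
import threading

class ThreadSafeWriter:
    """Wrap csv.writer so that every write happens under a lock."""
    def __init__(self, *args, **kwargs):
        self._writer = csv.writer(*args, **kwargs)
        self._lock = threading.Lock()

    def writerow(self, row):
        with self._lock:
            return self._writer.writerow(row)

    def writerows(self, rows):
        with self._lock:
            return self._writer.writerows(rows)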
Answer 2 (score: 1)
Late-to-the-party note: you could handle this a different way, without any locking, by having a single writer consume rows from a shared queue, with the threads that do the processing pushing rows onto that queue.
from threading import Thread
from queue import Queue
from concurrent.futures import ThreadPoolExecutor

# CSV writer setup goes here

queue = Queue()

def consume():
    while True:
        # Block until a row is available instead of busy-polling.
        i = queue.get()
        # Row comes out of queue; CSV writing goes here
        print(i)
        if i == 4999:
            return

consumer = Thread(target=consume)
consumer.daemon = True
consumer.start()

def produce(i):
    # Data processing goes here; row goes into queue
    queue.put(i)

with ThreadPoolExecutor(max_workers=10) as executor:
    for i in range(5000):
        executor.submit(produce, i)

consumer.join()
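One caveat with the hard-coded check for 4999: the producers run concurrently, so the row for i == 4999 can land in the queue before earlier rows, and the consumer would then exit early. A common variation (my addition, not part of the original answer) is to stop the consumer with a sentinel pushed only after all producers have finished:

SENTINEL = None  # illustrative marker meaning "no more rows"

def consume():
    while True:
        i = queue.get()  # blocks until an item is available
        if i is SENTINEL:
            return
        # CSV writing goes here
        print(i)

consumer = Thread(target=consume)
consumer.start()

with ThreadPoolExecutor(max_workers=10) as executor:
    for i in range(5000):
        executor.submit(produce, i)

queue.put(SENTINEL)  # all producers are done; tell the consumer to stop
consumer.join()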