tl;dr - Is there any way to speed up simultaneous reads and writes on a multiprocessing queue?
I have an application that processes audit data. Think of it as a syslog relay: it receives data, parses it, and forwards the events onward. The event rate can be high - I am shooting for 15,000 events per second (EPS).
I have two queues:

in_queue = multiprocessing.Queue()
out_queue = multiprocessing.Queue()

Reader processes push raw events onto in_queue with in_queue.put(). Parser processes call in_queue.get() to fetch the data, process it, and then use out_queue.put() to hand the results off. A sender process reads out_queue with out_queue.get() and forwards the data onward over a TCP socket connection.

I ran tests against the queues in isolation - I can push OR pull events on a queue at 25,000 EPS. The slowdown occurs when multiple parsing processes (4) pull data off the queue while data is being written to it. The rate drops below 10,000 EPS. My guess is that the underlying pipes, locks, etc. are the cause of the delay.

I read up on pipes, and it looks like a pipe only supports 2 endpoints, while I need to fan the CPU-intensive parsing out to multiple processes. Could an alternative such as multiprocessing shared memory achieve better results? How can I get better simultaneous .put() and .get() operations on a queue?
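For reference, the pipeline described above can be sketched roughly like this (a minimal Python 3 sketch; the worker count, the sentinel-based shutdown, and the .upper() call standing in for real parsing are all illustrative assumptions, not the OP's actual code):

```python
import multiprocessing

def parser(in_queue, out_queue):
    """Pull raw events, 'parse' them, and push results downstream."""
    while True:
        event = in_queue.get()
        if event is None:          # sentinel: no more work for this worker
            break
        out_queue.put(event.upper())  # stand-in for real CPU-bound parsing

def run_pipeline(events, n_workers=4):
    in_queue = multiprocessing.Queue()
    out_queue = multiprocessing.Queue()
    workers = [multiprocessing.Process(target=parser, args=(in_queue, out_queue))
               for _ in range(n_workers)]
    for w in workers:
        w.start()
    for event in events:
        in_queue.put(event)
    for _ in workers:
        in_queue.put(None)         # one sentinel per worker
    # Drain results before joining, so workers never block on a full queue
    results = [out_queue.get() for _ in events]
    for w in workers:
        w.join()
    return results

if __name__ == "__main__":
    print(sorted(run_pipeline(["a", "b", "c"])))
```

With several workers the output order is nondeterministic, which is why the example sorts before printing.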
Answer 0 (score: 2)
Given your performance needs, I think you would be better off using a third-party message broker such as ZeroMQ or RabbitMQ. I found a benchmark here comparing the two approaches (though it doesn't exactly match your use case). The performance difference is enormous:
multiprocessing.Queue Results

python2 ./multiproc_with_queue.py
Duration: 164.182257891
Messages Per Second: 60907.9210414
0mq Results

python2 ./multiproc_with_zeromq.py
Duration: 23.3490710258
Messages Per Second: 428282.563744
I took those two tests and gave them a more complicated workload, since one benefit of multiprocessing.Queue is that it handles serialization for you. Here are the new scripts:
mult_queue.py
import sys
import time
from multiprocessing import Process, Queue

def worker(q):
    # Consume one million messages, deserializing each one
    for task_nbr in range(1000000):
        message = q.get()
    sys.exit(1)

def main():
    send_q = Queue()
    Process(target=worker, args=(send_q,)).start()
    msg = {
        'something': "More",
        "another": "thing",
        "what?": range(200),
        "ok": ['asdf', 'asdf', 'asdf']
    }
    for num in range(1000000):
        send_q.put(msg)

if __name__ == "__main__":
    start_time = time.time()
    main()
    end_time = time.time()
    duration = end_time - start_time
    msg_per_sec = 1000000 / duration
    print "Duration: %s" % duration
    print "Messages Per Second: %s" % msg_per_sec
mult_zmq.py
import sys
import time
from multiprocessing import Process

import zmq

def worker():
    context = zmq.Context()
    work_receiver = context.socket(zmq.PULL)
    work_receiver.connect("tcp://127.0.0.1:5557")
    # Consume one million messages, unpickling each one
    for task_nbr in range(1000000):
        message = work_receiver.recv_pyobj()
    sys.exit(1)

def main():
    Process(target=worker, args=()).start()
    context = zmq.Context()
    ventilator_send = context.socket(zmq.PUSH)
    ventilator_send.bind("tcp://127.0.0.1:5557")
    msg = {
        'something': "More",
        "another": "thing",
        "what?": range(200),
        "ok": ['asdf', 'asdf', 'asdf']
    }
    for num in range(1000000):
        ventilator_send.send_pyobj(msg)

if __name__ == "__main__":
    start_time = time.time()
    main()
    end_time = time.time()
    duration = end_time - start_time
    msg_per_sec = 1000000 / duration
    print "Duration: %s" % duration
    print "Messages Per Second: %s" % msg_per_sec
Output:
dan@dan:~$ ./mult_zmq.py
Duration: 14.0204648972
Messages Per Second: 71324.3110935
dan@dan:~$ ./mult_queue.py
Duration: 27.2135331631
Messages Per Second: 36746.4229657
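If switching transports is not an option, one stdlib-only mitigation (a general technique, not part of the answer above; the helper names and batch size are illustrative) is to batch events so that each .put()/.get() pays the pipe-and-lock cost once per batch instead of once per event:

```python
import multiprocessing

def batch_parser(in_q, out_q):
    """Process whole batches so each get()/put() acquires the lock once."""
    while True:
        batch = in_q.get()
        if batch is None:          # sentinel: shut down
            break
        out_q.put([event.upper() for event in batch])  # stand-in for parsing

def run_batched(events, batch_size=500):
    in_q = multiprocessing.Queue()
    out_q = multiprocessing.Queue()
    p = multiprocessing.Process(target=batch_parser, args=(in_q, out_q))
    p.start()
    batches = [events[i:i + batch_size] for i in range(0, len(events), batch_size)]
    for batch in batches:
        in_q.put(batch)
    in_q.put(None)
    parsed = []
    for _ in batches:              # drain before joining to avoid deadlock
        parsed.extend(out_q.get())
    p.join()
    return parsed

if __name__ == "__main__":
    print(run_batched(["a", "b", "c"], batch_size=2))
```

Batching trades a little latency for throughput, which is usually acceptable for a relay aiming at a sustained EPS target rather than per-event delivery time.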