Using python-redis with the multiprocessing module

Date: 2016-02-18 16:34:58

Tags: python redis multiprocessing connection-pooling

I am new to the python redis module, and I am looking for an efficient way to pool connections while using the python multiprocessing module. My code is as follows:


```python
import multiprocessing as mp
import redis

def _process_messages_parallel(messages):
    processes = [mp.Process(target=_process_message, args=[message])
                 for message in messages]
    for process in processes:
        process.daemon = False
        process.start()
    for process in processes:
        process.join()

def _process_message(message):
    organization = message['organization']
    # get value associated with `organization` key from redis
    result = get_val_from_redis(organization)
    # process result

# assume local redis instance for now
def get_val_from_redis(organization, host='localhost', port=6379, db=0,
                       retry_on_timeout=True):
    conn = redis.StrictRedis(host=host, port=port, db=db,
                             retry_on_timeout=retry_on_timeout)
    return conn.get(organization)

if __name__ == "__main__":
    messages = [{'organization': 'org1'},
                {'organization': 'org2'},
                {'organization': 'org3'},
                ...
                {'organization': 'orgN'}]
    process_articles = mp.Process(target=_process_messages_parallel,
                                  args=[messages])
    process_articles.daemon = False
    process_articles.start()
    process_articles.join()
```

Here I essentially want to reuse the same connection pool across the different processes created by the `_process_messages_parallel` function. Can anyone suggest a good way to do this? A simple solution might be to create a module-scoped redis connection pool, or a function-scoped one, but my concern is how python modules and functions are shared between different processes. If a separate copy of the object is created for each process, doesn't that defeat the purpose of connection pooling?

0 Answers:

No answers yet.