I'm trying to build a Python script that runs a pool of worker processes (using multiprocessing.Pool) over a large data set.
I want each process to hold a unique object that is reused across that process's multiple work invocations.
Pseudocode:
def work(data):
    # connection should be unique per process
    connection.put(data)
    print 'work done with connection:', connection

if __name__ == '__main__':
    pPool = Pool()  # pool of 4 processes
    datas = [1..1000]
    for process in pPool:
        # this is the part I'm asking about -- how do I actually do this?
        process.connection = Connection(conargs)
    for data in datas:
        pPool.apply_async(work, (data,))
Answer 0 (score: 1)

I think something like this should work (untested):
def init(*args):
    global connection
    connection = Connection(*args)

pPool = Pool(initializer=init, initargs=conargs)
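For reference, a minimal self-contained sketch of this initializer pattern (untested against your setup; the Connection class, its host/port arguments, and conargs below are stand-ins for whatever connection object you actually use):

    import multiprocessing as mp
    import os

    class Connection(object):
        # Stand-in for a real per-process resource (DB handle, socket, queue client, ...).
        def __init__(self, host, port):
            self.host, self.port = host, port
        def put(self, data):
            print('pid %s put %r via %s:%s' % (os.getpid(), data, self.host, self.port))

    def init(*args):
        # Runs once in every worker process, so the module-level global
        # ends up unique per process and is reused across tasks.
        global connection
        connection = Connection(*args)

    def work(data):
        connection.put(data)

    if __name__ == '__main__':
        conargs = ('localhost', 5672)
        pool = mp.Pool(processes=4, initializer=init, initargs=conargs)
        pool.map(work, range(10))
        pool.close()
        pool.join()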
Answer 1 (score: 1)

It may be simplest to create mp.Process instances directly (without using mp.Pool):
import multiprocessing as mp
import time

class Connection(object):
    def __init__(self, name):
        self.name = name
    def __str__(self):
        return self.name

def work(inqueue, conn):
    # Each worker keeps its own Connection instance for its whole lifetime.
    name = mp.current_process().name
    while 1:
        data = inqueue.get()
        time.sleep(.5)
        print('{n}: work done with connection {c} on data {d}'.format(
            n=name, c=conn, d=data))
        inqueue.task_done()

if __name__ == '__main__':
    N = 4
    procs = []
    inqueue = mp.JoinableQueue()
    for i in range(N):
        conn = Connection(name='Conn-' + str(i))
        proc = mp.Process(target=work, name='Proc-' + str(i),
                          args=(inqueue, conn))
        proc.daemon = True
        proc.start()
        procs.append(proc)
    datas = range(1, 11)
    for data in datas:
        inqueue.put(data)
    inqueue.join()
which yields:
Proc-0: work done with connection Conn-0 on data 1
Proc-1: work done with connection Conn-1 on data 2
Proc-3: work done with connection Conn-3 on data 3
Proc-2: work done with connection Conn-2 on data 4
Proc-0: work done with connection Conn-0 on data 5
Proc-1: work done with connection Conn-1 on data 6
Proc-3: work done with connection Conn-3 on data 7
Proc-2: work done with connection Conn-2 on data 8
Proc-0: work done with connection Conn-0 on data 9
Proc-1: work done with connection Conn-1 on data 10
Note that each Proc number always corresponds to the same Conn number.
Answer 2 (score: 0)

Process-local storage is fairly easy to implement as a mapping container. For anyone else who landed here from Google looking for something similar (note this is Py3, but it converts easily to Python 2 syntax, just inherit from object):
import os

class ProcessLocal:
    """
    Provides a basic per-process mapping container that wipes itself
    if the current PID changed since the last get/set.

    Aka `threading.local()`, but for processes instead of threads.
    """

    __pid__ = -1

    def __init__(self, mapping_factory=dict):
        self.__mapping_factory = mapping_factory

    def __handle_pid(self):
        new_pid = os.getpid()
        if self.__pid__ != new_pid:
            self.__pid__, self.__store = new_pid, self.__mapping_factory()

    def __delitem__(self, key):
        self.__handle_pid()
        return self.__store.__delitem__(key)

    def __getitem__(self, key):
        self.__handle_pid()
        return self.__store.__getitem__(key)

    def __setitem__(self, key, val):
        self.__handle_pid()
        return self.__store.__setitem__(key, val)
See more at https://github.com/akatrevorjay/pytutils/blob/develop/pytutils/mappings.py
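A small usage sketch, assuming the ProcessLocal class above is defined in the same module (the 'conn' key and the stand-in connection string are made up for illustration): a single module-level ProcessLocal can lazily cache one connection per worker process.

    import multiprocessing as mp
    import os

    proc_local = ProcessLocal()  # module-level, shared by all tasks within one process

    def get_connection():
        # Create the connection lazily, at most once per worker process.
        try:
            return proc_local['conn']
        except KeyError:
            proc_local['conn'] = 'Connection(pid=%s)' % os.getpid()  # stand-in object
            return proc_local['conn']

    def work(data):
        return '%r handled by %s' % (data, get_connection())

    if __name__ == '__main__':
        with mp.Pool(4) as pool:
            for line in pool.map(work, range(8)):
                print(line)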
Answer 3 (score: -1)

You want the object to reside in shared memory, right?
Python has some support for that in its standard library, but it is fairly limited; as far as I know, only integers and a few other primitive types can be stored.
Try POSH (Python Object Sharing): http://poshmodule.sourceforge.net/
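For completeness, a rough illustration of the standard-library support mentioned above: multiprocessing.Value and multiprocessing.Array share ctypes-style scalars and arrays between processes, but not arbitrary Python objects.

    import multiprocessing as mp

    def work(counter, arr, i):
        # Value and Array live in shared memory; they hold primitive ctypes data only.
        with counter.get_lock():
            counter.value += 1
        arr[i] = i * i

    if __name__ == '__main__':
        counter = mp.Value('i', 0)   # shared int
        arr = mp.Array('d', 5)       # shared array of 5 doubles, initialised to 0.0
        procs = [mp.Process(target=work, args=(counter, arr, i)) for i in range(5)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print(counter.value, list(arr))   # -> 5 [0.0, 1.0, 4.0, 9.0, 16.0]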