I have a class that generates an unlimited amount of data objects. The class has a child process that produces this data. Since that child process needs to use a worker pool, it cannot be daemonic. The problem is that this process never finishes its work (that is its purpose), so it can never be joined when the main process finishes, and so Python never exits. I therefore have to terminate it reliably, in some way, when the main program ends. This turns out to be more complicated than I thought, because I cannot rely on the class's destructor, and by default Python processes do not time out on their join method. A hacky workaround is to implement a cleanup function that is called at the very end and terminates the worker, but that is obviously not a good solution, because it is easy to forget to call it (a sketch of that workaround follows the example below). Do you have any suggestions for solving this? Regards, Fabian
Here is a minimal example:
from builtins import range
from builtins import object
from multiprocessing import Process
from multiprocessing import Queue
from multiprocessing import Pool
from collections import deque


def transform(x):
    return x / 10


class SomeName(object):
    def __init__(self):
        self.was_started = False

    def start(self):
        print("started")

        def produce(target_queue, num_processes):
            print("producer started")
            ctr = 0
            pool = Pool(num_processes)
            results = deque()
            for i in range(num_processes):
                item = ctr
                ctr += 1
                results.append(pool.apply_async(transform, args=(item,)))
            while True:
                target_queue.put(results.popleft().get())
                item = ctr
                ctr += 1
                results.append(pool.apply_async(transform, args=(item,)))

        self.queue = Queue(4)
        self.process = Process(target=produce, args=(self.queue, 4))
        self.process.daemon = False
        self.process.start()
        self.was_started = True

    def __next__(self):
        if not self.was_started:
            self.start()
        return self.queue.get()


if __name__ == "__main__":
    a = SomeName()
    [print(next(a)) for _ in range(20)]
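For reference, the cleanup-function workaround I mentioned would look roughly like this. Registering it with atexit (the names _cleanup and the registration pattern are just an illustration on my part) at least makes it harder to forget, but it still feels like a band-aid:

import atexit

def _cleanup(instance):
    # terminate the never-ending producer so the interpreter can shut down
    if instance.was_started:
        instance.process.terminate()
        instance.process.join()

if __name__ == "__main__":
    a = SomeName()
    atexit.register(_cleanup, a)  # runs automatically at normal interpreter exit
    [print(next(a)) for _ in range(20)]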
Answer 0 (score: 0)
Consider a termination hotkey. Also consider registering the worker threads, and then having a kill command that stops all of them.
If you are looking for a terminal-based solution to kill all of the threads, give them a unique name so that they are easy to grab from the command line and run kill against all of them.
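As I read this suggestion, the "register the workers, then kill them all" part could look roughly like the sketch below, adapted to the processes from the question rather than threads; the registry and function names are made up for illustration:

import atexit
from multiprocessing import Process

_registered_workers = []  # module-level registry of long-running workers

def register_worker(process):
    _registered_workers.append(process)
    return process

def kill_all_workers():
    # the "kill command": terminate and reap every registered worker
    for p in _registered_workers:
        if p.is_alive():
            p.terminate()
        p.join()
    _registered_workers.clear()

# e.g. inside SomeName.start():
#     self.process = register_worker(Process(target=produce, args=(self.queue, 4)))
# and at the end of the main program, or via atexit.register(kill_all_workers):
#     kill_all_workers()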
Answer 1 (score: 0)
One possible solution (not sure whether this is too hacky):
from builtins import range
from builtins import object
from multiprocessing import Process
from multiprocessing import Queue
from multiprocessing import Pool
from collections import deque


def transform(x):
    return x / 10


class ProcessTerminateOnJoin(Process):
    def join(self, timeout=None):
        self.terminate()
        super(ProcessTerminateOnJoin, self).join(0.01)


class SomeName(object):
    def __init__(self):
        self.was_started = False

    def start(self):
        print("started")

        def produce(target_queue, num_processes):
            print("producer started")
            ctr = 0
            pool = Pool(num_processes)
            results = deque()
            for i in range(num_processes):
                item = ctr
                ctr += 1
                results.append(pool.apply_async(transform, args=(item,)))
            while True:
                target_queue.put(results.popleft().get())
                item = ctr
                ctr += 1
                results.append(pool.apply_async(transform, args=(item,)))

        self.queue = Queue(4)
        self.process = ProcessTerminateOnJoin(target=produce, args=(self.queue, 4))
        self.process.daemon = False
        self.process.start()
        self.was_started = True

    def __next__(self):
        if not self.was_started:
            self.start()
        return self.queue.get()


if __name__ == "__main__":
    a = SomeName()
    [print(next(a)) for _ in range(20)]
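A note on why this appears to work (my own reading, not part of the answer): multiprocessing registers an exit handler that joins every non-daemon child when the interpreter shuts down, so with the overridden join() the producer is terminated automatically at the end of the main script instead of blocking forever. An explicit shutdown also stays cheap, roughly:

if __name__ == "__main__":
    a = SomeName()
    for _ in range(20):
        print(next(a))
    a.process.join()  # terminate() runs first, so this returns almost immediately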