Python for loop question

Date: 2016-06-22 03:02:02

Tags: python for-loop while-loop multiprocessing pooling

To continue: I have some entries in my database, and I make the following Django query:

variables = Variable.objects.order_by('foo').values('foo')

Then I have a for statement that runs for each variable found:

for x in variables:
   #doing something....

My problem is that the "doing something" is an ongoing task... i.e. it never stops. So how can I get the for loop to move on to the second variable?

I think this has something to do with pooling, but doesn't that mean I can only have 4 processes at a time? What if I want to run 50 separate processes, one for each of 50 variables, where each process doesn't stop until some point in time, or ever... how would I do that?

Is that even possible?
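For reference, multiprocessing.Pool is not capped at 4 workers; the pool size is whatever you pass as its processes argument, and it only falls back to the CPU count when you omit it. A minimal sketch, with a placeholder worker function that is not part of the original post:

import multiprocessing

def worker(foo):
    # placeholder for the long-running task
    print("working on %s" % foo)

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=50)  # 50 workers instead of the CPU-count default
    pool.map(worker, ["item-%d" % i for i in range(50)])
    pool.close()
    pool.join()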

Here is my multiprocessing code:

if __name__ == '__main__':
    x = Variable.objects.order_by('foo').values('foo')
    for t in x:
        t = t.values()
        foo = "".join(t)
        info('Starting...')
        p = Process(target=myfunction, args=(foo,))
        p.start()
        p.join()

myfunction runs in an infinite loop...
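A side observation on the snippet above: p.join() inside the loop waits for that child process to finish before the next iteration starts, so with a target that never returns the loop never reaches the second variable. A minimal sketch of starting every process first and only joining afterwards, reusing myfunction and the queryset from above:

from multiprocessing import Process

if __name__ == '__main__':
    x = Variable.objects.order_by('foo').values('foo')
    processes = []
    for t in x:
        foo = "".join(t.values())
        p = Process(target=myfunction, args=(foo,))
        p.start()            # start the child, then continue the loop immediately
        processes.append(p)

    for p in processes:
        p.join()             # only wait for the children after all of them have started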

@samuel:

import multiprocessing
import Queue  # Python 2 module providing the Empty exception

# globals
my_queue = multiprocessing.Manager().Queue() # queue to store our values
stop_event = multiprocessing.Event() # flag which signals processes to stop
my_pool = None

def my_function(foo):
    while not stop_event.is_set():
        print("starting %s" % foo)
        try:
            var = my_queue.get_nowait() # getting value from queue
        except Queue.Empty:
            print("No more items in queue")
        # do your logic here


# Since `t` could have unlimited size but we do want to limit processes,
# we'll put all `t` values in the queue

x = Company.objects.order_by('ticker').values('ticker')
for t in x:
    foo = t.values()
    my_queue.put(foo)

MAX_PROCESSES = len(x)
my_pool = multiprocessing.Pool(MAX_PROCESSES)

for i in range(MAX_PROCESSES):
    my_pool.apply_async(my_function, args=(foo,))
my_pool.close()
my_pool.join()

1 answer:

Answer 0 (score: 0)

Here is how a solution could be implemented using the multiprocessing library.

We will use Pool, apply_async and Queue.

import multiprocessing
import time
import Queue  # Python 2 module providing the Empty exception
from multiprocessing import Process

# globals
MAX_PROCESSES = 50
my_queue = multiprocessing.Manager().Queue() # queue to store our values
stop_event = multiprocessing.Event() # flag which signals processes to stop
my_pool = None

def my_function(proc_name, var):
    while not stop_event.is_set():
        pass # do your logic here with the var variable


def var_scanner():
    # Since `t` could have unlimited size we'll put all `t` values in the queue
    while not stop_event.is_set(): # forever scan `values` for new items
        x = Variable.objects.order_by('foo').values('foo')
        for t in x:
            t = t.values()
            my_queue.put(t)
        time.sleep(10)

try:
    var_scanner_process = Process(target=var_scanner)
    var_scanner_process.start()
    my_pool = multiprocessing.Pool(MAX_PROCESSES)

    proc_count = 0
    while not stop_event.is_set():
        try: # if queue isn't empty, get value from queue and create new process
            var = my_queue.get_nowait() # getting value from queue
            p = Process(target=my_function, args=("process-%d" % proc_count, var))
            p.start()
            proc_count += 1
        except Queue.Empty:
            print("No more items in queue")
            time.sleep(1) # avoid busy-waiting while the queue is empty

except KeyboardInterrupt as stop_test_exception:
    print(" CTRL+C pressed. Stopping test....")
    stop_event.set()
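For what it's worth, here is a small self-contained sketch of the same pattern (a Manager queue feeding long-running worker processes, stopped via an Event), with plain strings standing in for the Django queryset so it can be run on its own; the item list and worker body are placeholders, not part of the original answer:

import multiprocessing
import time

def worker(name, queue, stop_event):
    # each worker pulls one item and then loops on it until told to stop
    item = queue.get()
    while not stop_event.is_set():
        print("%s handling %s" % (name, item))
        time.sleep(1)

if __name__ == '__main__':
    manager = multiprocessing.Manager()
    queue = manager.Queue()
    stop_event = manager.Event()

    items = ['foo', 'bar', 'baz']  # stand-in for the Django values() queryset
    for item in items:
        queue.put(item)

    workers = [multiprocessing.Process(target=worker,
                                       args=("process-%d" % i, queue, stop_event))
               for i in range(len(items))]
    for w in workers:
        w.start()                  # every worker runs concurrently

    try:
        time.sleep(5)              # let the workers run for a while
    except KeyboardInterrupt:
        pass
    stop_event.set()               # signal all workers to finish their loops
    for w in workers:
        w.join()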