Python: locking a critical section

Date: 2015-04-22 13:10:59

Tags: python multiprocessing python-multiprocessing

I'm trying to use Python's multiprocessing library to process "tests" concurrently. I have a list of tests stored in the variable test_files. I want the workers to pop tests off test_files and call the process_test function on them. However, when I run this code, both processes run the same test. It seems I'm not accessing test_files in a process-safe way. What am I doing wrong?

Code

import multiprocessing

def process_worker(lock, test_files):
    # Keep going until we run out of tests
    while True:
        test_file = None
        # Critical section of code
        lock.acquire()
        try:
            if len(test_files) != 0:
                test_file = test_files.pop()
        finally:
            lock.release()
        # End critical section of code

        # If there is another test in the queue process it
        if test_file is not None:
            print("Running test {0} on worker {1}".format(test_file, multiprocessing.current_process().name))
            process_test(test_file)
        else:
            # No more tests to process
            return

# Mutex for workers
lock = multiprocessing.Lock()

# Declare our workers
p1 = multiprocessing.Process(target=process_worker, name="Process 1", args=(lock, test_files))
p2 = multiprocessing.Process(target=process_worker, name="Process 2", args=(lock, test_files))

# Start processing
p1.start()
p2.start()

# Block until both workers finish
p1.join()
p2.join()

Output

Running test "BIT_Test" on worker Process 1
Running test "BIT_Test" on worker Process 2

2 Answers:

Answer 0 (score: 4)

Trying to share a list like this is not the right approach. You should use a process-safe data structure, such as multiprocessing.Queue, or better yet, use multiprocessing.Pool and let it handle the queuing for you. What you're doing fits Pool.map perfectly:

import multiprocessing

def process_worker(test_file):
    print("Running test {0} on worker {1}".format(test_file, multiprocessing.current_process().name))
    process_test(test_file)


p = multiprocessing.Pool(2) # 2 processes in the pool
# map puts each item from test_files in a Queue, lets the
# two processes in our pool pull each item from the Queue,
# and then execute process_worker with that item as an argument.
p.map(process_worker, test_files)
p.close()
p.join()

Much simpler!

Answer 1 (score: 3)

You could also use multiprocessing.Manager:

import multiprocessing

def process_worker(lock, test_files):
    # Keep going until we run out of tests
    while True:
        test_file = None
        # Critical section of code
        lock.acquire()
        try:
            if len(test_files) != 0:
                test_file = test_files.pop()
        finally:
            lock.release()
        # End critical section of code

        # If there is another test in the queue process it
        if test_file is not None:
            print("Running test %s on worker %s" % (test_file, multiprocessing.current_process().name))
            #process_test(test_file)
        else:
            # No more tests to process
            return

# Mutex for workers
lock = multiprocessing.Lock()
manager = multiprocessing.Manager()

test_files = manager.list(['f1', 'f2', 'f3'])

# Declare our workers
p1 = multiprocessing.Process(target=process_worker, name="Process 1", args=(lock, test_files))
p2 = multiprocessing.Process(target=process_worker, name="Process 2", args=(lock, test_files))

# Start processing
p1.start()
p2.start()

# Block until both workers finish
p1.join()
p2.join()