Multiprocessing Queue.get() hangs

Date: 2016-05-15 22:05:18

Tags: python database queue multiprocessing

I am trying to implement basic multiprocessing and have run into an issue. The Python script is attached below.

import time, sys, random, threading
from multiprocessing import Process
from Queue import Queue
from FrequencyAnalysis import FrequencyStore, AnalyzeFrequency

append_queue = Queue(10)
database = FrequencyStore()

def add_to_append_queue(_list):
    append_queue.put(_list)

def process_append_queue():
    while True:
        item = append_queue.get()
        database.append(item)
        print("Appended to database in %.4f seconds" % database.append_time)
        append_queue.task_done()
    return

def main():
    database.load_db()
    print("Database loaded in %.4f seconds" % database.load_time)
    append_queue_process = Process(target=process_append_queue)
    append_queue_process.daemon = True
    append_queue_process.start()
    #t = threading.Thread(target=process_append_queue)
    #t.daemon = True
    #t.start()

    while True:
        path = raw_input("file: ")
        if path == "exit":
            break
        a = AnalyzeFrequency(path)
        a.analyze()
        print("Analyzed file in %.4f seconds" % a._time)
        add_to_append_queue(a.get_results())

    append_queue.join()
    #append_queue_process.join()
    database.save_db()
    print("Database saved in %.4f seconds" % database.save_time)
    sys.exit(0)

if __name__=="__main__":
    main()

AnalyzeFrequency analyzes the frequencies of words in a file, and get_results() returns a sorted list of those words and frequencies. The list is very large, perhaps 10000 items.
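FrequencyAnalysis is not shown in the question, but a stand-in conveys the shape of the data: get_results() yields (word, count) pairs sorted by frequency. A hypothetical Python 3 sketch (analyze_frequency is my name, not the original API):

```python
from collections import Counter

def analyze_frequency(text):
    """Hypothetical stand-in for AnalyzeFrequency + get_results():
    count word occurrences and return (word, count) pairs sorted
    by count, highest first."""
    counts = Counter(text.lower().split())
    return sorted(counts.items(), key=lambda pair: pair[1], reverse=True)

results = analyze_frequency("the cat sat on the mat the end")
print(results[0])  # -> ('the', 3)
```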

This list is then passed to the add_to_append_queue method, which puts it on the queue. process_append_queue takes the items one by one and adds the frequencies to the "database". This operation takes a bit longer than the actual analysis in main(), so I am trying to use a separate process for this method. When I try to do this with the threading module, everything works fine with no errors. When I try to use Process, the script hangs at item = append_queue.get().
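For contrast, this is why the threaded variant behaves: both threads live in one process, so they share the module-level queue object. A minimal Python 3 sketch of that working pattern (queue is the Python 3 name of the Python 2 Queue module; the "database" is reduced to a list):

```python
import queue
import threading

append_queue = queue.Queue(10)
database = []  # stand-in for FrequencyStore

def process_append_queue():
    # Same loop shape as in the question: get, append, task_done.
    while True:
        item = append_queue.get()
        database.append(item)
        append_queue.task_done()

worker = threading.Thread(target=process_append_queue)
worker.daemon = True   # dies with the main thread
worker.start()

for item in ["list-1", "list-2", "list-3"]:
    append_queue.put(item)

append_queue.join()    # blocks until every put() has been task_done()'d
print(database)        # ['list-1', 'list-2', 'list-3']
```

Because the thread shares the process's memory, the same append_queue object is visible on both sides; with multiprocessing on Windows that is no longer true, which is the crux of the answers below.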

Could someone explain what is going on here, and perhaps point me toward a fix?

All answers appreciated!

Update

The pickle error was my fault, it was just a typo. Now I am using the Queue class from multiprocessing, but the append_queue.get() method still hangs. New code:

import time, sys, random
from multiprocessing import Process, Queue
from FrequencyAnalysis import FrequencyStore, AnalyzeFrequency

append_queue = Queue()
database = FrequencyStore()

def add_to_append_queue(_list):
    append_queue.put(_list)

def process_append_queue():
    while True:
        database.append(append_queue.get())
        print("Appended to database in %.4f seconds" % database.append_time)
    return

def main():
    database.load_db()
    print("Database loaded in %.4f seconds" % database.load_time)
    append_queue_process = Process(target=process_append_queue)
    append_queue_process.daemon = True
    append_queue_process.start()
    #t = threading.Thread(target=process_append_queue)
    #t.daemon = True
    #t.start()

    while True:
        path = raw_input("file: ")
        if path == "exit":
            break
        a = AnalyzeFrequency(path)
        a.analyze()
        print("Analyzed file in %.4f seconds" % a._time)
        add_to_append_queue(a.get_results())

    #append_queue.join()
    #append_queue_process.join()
    print str(append_queue.qsize())
    database.save_db()
    print("Database saved in %.4f seconds" % database.save_time)
    sys.exit(0)

if __name__=="__main__":
    main()

Update 2

Here is the database code:

class FrequencyStore:

    def __init__(self):
        self.sorter = Sorter()
        self.db = {}
        self.load_time = -1
        self.save_time = -1
        self.append_time = -1
        self.sort_time = -1

    def load_db(self):
        start_time = time.time()

        try:
            file = open("results.txt", 'r')
        except:
            raise IOError

        self.db = {}
        for line in file:
            word, count = line.strip("\n").split("=")
            self.db[word] = int(count)
        file.close()

        self.load_time = time.time() - start_time

    def save_db(self):
        start_time = time.time()

        _db = []
        for key in self.db:
            _db.append([key, self.db[key]])
        _db = self.sort(_db)

        try:
            file = open("results.txt", 'w')
        except:
            raise IOError

        file.truncate(0)
        for x in _db:
            file.write(x[0] + "=" + str(x[1]) + "\n")
        file.close()

        self.save_time = time.time() - start_time

    def create_sorted_db(self):
        _temp_db = []
        for key in self.db:
            _temp_db.append([key, self.db[key]])
        _temp_db = self.sort(_temp_db)
        _temp_db.reverse()
        return _temp_db

    def get_db(self):
        return self.db

    def sort(self, _list):
        start_time = time.time()

        _list = self.sorter.mergesort(_list)
        _list.reverse()

        self.sort_time = time.time() - start_time
        return _list

    def append(self, _list):
        start_time = time.time()

        for x in _list:
            if x[0] not in self.db:
                self.db[x[0]] = x[1]
            else:
                self.db[x[0]] += x[1]

        self.append_time = time.time() - start_time
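As an aside, append()'s merge loop is a classic counter merge; a collections.Counter does the same thing and can serve as a sanity check (the Counter here is illustrative, not part of the original class):

```python
from collections import Counter

db = Counter({"the": 10, "cat": 2})

# A get_results()-style payload: a list of [word, count] pairs.
batch = [["the", 3], ["dog", 1]]

# Equivalent to FrequencyStore.append's loop: absent keys start
# at 0, existing keys are incremented.
for word, count in batch:
    db[word] += count

print(dict(db))  # {'the': 13, 'cat': 2, 'dog': 1}
```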

2 Answers:

Answer 0 (score: 5)

The comments suggest you are trying to run this on Windows. As I said in a comment:

If you're running this on Windows, it can't work: Windows doesn't have fork(), so each process gets its own Queue and they have nothing to do with each other. The entire module is imported "from scratch" by each process on Windows. You need to create the Queue in main(), and pass it as an argument to the worker function.

Here is a fleshed-out version of what you need to do to make it portable. I removed all the database stuff because it is irrelevant to the problem you have described so far, and I also removed the daemon fiddling, because that is usually just a lazy way to avoid shutting things down cleanly, and will often come back to bite you later:

def process_append_queue(append_queue):
    while True:
        x = append_queue.get()
        if x is None:
            break
        print("processed %d" % x)
    print("worker done")

def main():
    import multiprocessing as mp

    append_queue = mp.Queue(10)
    append_queue_process = mp.Process(target=process_append_queue, args=(append_queue,))
    append_queue_process.start()
    for i in range(100):
        append_queue.put(i)
    append_queue.put(None)  # tell worker we're done
    append_queue_process.join()

if __name__=="__main__":
    main()

The output is the "obvious" stuff:

processed 0
processed 1
processed 2
processed 3
processed 4
...
processed 96
processed 97
processed 98
processed 99
worker done

Note: because Windows doesn't (can't) fork(), worker processes cannot inherit any Python object on Windows. Each process runs the whole program from scratch. That is why your original program couldn't work: each process created its own Queue, wholly unrelated to the Queue in the other process. In the approach shown above, only the main process creates a Queue, and the main process passes it (as an argument) to the worker process.
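The same pattern extends to several workers: still create the Queue in main() and pass it as an argument, then put one sentinel per worker so each one shuts down cleanly. A sketch with a placeholder squaring task (the worker count and task here are illustrative):

```python
import multiprocessing as mp

def worker(in_q, out_q):
    while True:
        item = in_q.get()
        if item is None:      # each worker consumes exactly one sentinel
            break
        out_q.put(item * item)

def run(n_workers=2, n_items=10):
    in_q = mp.Queue()
    out_q = mp.Queue()
    procs = [mp.Process(target=worker, args=(in_q, out_q))
             for _ in range(n_workers)]
    for p in procs:
        p.start()
    for i in range(n_items):
        in_q.put(i)
    for _ in procs:           # one None per worker, after the real work
        in_q.put(None)
    # Drain results before join() so a full pipe can't stall the workers;
    # ordering across workers is nondeterministic, hence sorted().
    results = sorted(out_q.get() for _ in range(n_items))
    for p in procs:
        p.join()
    return results

if __name__ == "__main__":
    print(run())  # squares of 0..9, in sorted order
```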

Answer 1 (score: 4)

Queue.Queue is thread-safe, but does not work across processes. This is quite easy to fix, though. Instead of:

from multiprocessing import Process
from Queue import Queue

you want:

from multiprocessing import Process, Queue
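A minimal round trip shows the fixed import in action, combined with the create-in-main-and-pass-as-argument pattern from the other answer, so it works whether the platform forks or spawns (the uppercasing task is just a placeholder, written in Python 3):

```python
from multiprocessing import Process, Queue

def consumer(in_q, out_q):
    # Runs in a separate process; the queues were passed as arguments,
    # so both processes talk over the same underlying pipes.
    while True:
        item = in_q.get()
        if item is None:       # sentinel: no more work
            break
        out_q.put(item.upper())

def main():
    in_q, out_q = Queue(), Queue()
    p = Process(target=consumer, args=(in_q, out_q))
    p.start()
    for word in ["fork", "pickle", "queue"]:
        in_q.put(word)
    in_q.put(None)
    # A single consumer preserves FIFO order.
    results = [out_q.get() for _ in range(3)]
    p.join()
    print(results)  # ['FORK', 'PICKLE', 'QUEUE']
    return results

if __name__ == "__main__":
    main()
```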