Can't pickle _thread.lock objects when using TensorFlow and multiprocessing

Time: 2018-07-08 13:15:53

Tags: python tensorflow python-multiprocessing

I am constructing tensors with TensorFlow. To speed this up, I added multiprocessing. However, I get the error "can't pickle _thread.lock objects". Here is the code:

import math
import tensorflow as tf
import numpy as np
import time
import multiprocessing
import os,sys
from multiprocessing import Queue

def mp_factorizer(input_layer, chunksize, nprocs):
    def worker(input_layer, chunksize, out_q):       

        temp = []
        with tf.device('/cpu:0'):
            for ii in range(chunksize):

                temp.append(tf.gather_nd(input_layer,[[ii,0]])) 
                #temp.append(1) 
        out_q.put(temp)
        print('ok')

    # Each process will get 'chunksize' nums and a queue to put its
    # output into
    out_q = Queue()
    procs = []

    for i in range(nprocs):
        p = multiprocessing.Process(
                target=worker,
                args=(input_layer, chunksize, out_q))
        procs.append(p)
        p.start()

    # Collect all results into a single list. We know how many chunks
    # of results to expect.
    resultdict = []
    for i in range(nprocs):
        resultdict = out_q.get() + resultdict

    # Wait for all worker processes to finish
    for p in procs:
        p.join()

    return resultdict


input_layer = tf.placeholder(tf.float64, shape=[64,4], name='input_layer')

nprocs=4
chunksize=16

aa=time.time()
collector = mp_factorizer(input_layer, chunksize, nprocs)
loss = tf.add_n(collector, name='loss')

If I change temp.append(tf.gather_nd(input_layer,[[ii,0]])) to temp.append(1), no error is raised.

It seems that TensorFlow tensors are what gets blocked. Does anyone know how to solve this problem?
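For context, a minimal sketch of what I believe is going on (this is my own reproduction, not part of the failing script): multiprocessing serializes with pickle whatever passes through a Queue, and TensorFlow tensors hold references to graph internals that contain thread locks, which cannot be pickled. The same failure can be reproduced with a bare lock:

```python
import pickle
import threading

# multiprocessing pickles anything put on a Queue (and, depending on the
# start method, the arguments passed to each Process). A TensorFlow tensor
# references graph/session internals that hold locks like this one, and
# pickling a lock raises TypeError -- the error seen above.
lock = threading.Lock()
try:
    pickle.dumps(lock)
except TypeError as e:
    print("pickle failed:", e)
```

This would explain why appending the plain integer 1 works: integers pickle fine, tensors do not.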

0 Answers:

There are no answers yet.