Can I add a placeholder for a variable and use queues at the same time?

Asked: 2016-09-26 17:07:21

Tags: python-3.x queue tensorflow placeholder

To feed data to my network in TensorFlow I use tf.train.batch, but I also want to "feed" training with a predefined learning rate. I have a function _getLearningRate that builds the list of values I want to use for the learning rate, and I make sure the elements are of type float32. However, I get the following error:

W tensorflow/core/framework/op_kernel.cc:940] Invalid argument: You must feed a value for placeholder tensor 'Placeholder' with dtype float
     [[Node: Placeholder = Placeholder[dtype=DT_FLOAT, shape=[], _device="/job:localhost/replica:0/task:0/gpu:0"]()]]
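(For context, _getLearningRate is essentially a step-decay schedule of float32 values. The snippet below is only an illustrative sketch of that idea; the actual boundaries and rates in my code are different.)

```
import numpy as np

def _getLearningRate(global_step_value, args):
    """Illustrative step-decay schedule; the real boundaries and rates differ."""
    boundaries = [10000, 20000, 30000]                            # steps at which the rate drops
    rates = np.array([1e-3, 5e-4, 1e-4, 5e-5], dtype=np.float32)  # one rate per interval
    idx = np.searchsorted(boundaries, global_step_value, side='right')
    return rates[idx]                                             # a single np.float32 value
```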

So I wonder whether this happens because several batches are processed at the same time, so the first one is indeed run with the lr value while the others in the queue simply get none. Or is there another reason? My code looks like this:

```
import tensorflow as tf

def launch_net():
  with tf.Graph().as_default():
    global_step = tf.Variable(0, trainable=False)
    lr = tf.placeholder(tf.float32)
    input_images = inputEKG.data
    input_labels = inputEKG.labels
    # queue runner slicing input tensors into single examples
    image, label = tf.train.slice_input_producer(
        [input_images, input_labels], num_epochs=args.num_epochs)
    label = tf.cast(label, tf.int32)
    # queue runner creates batches
    images, labels = tf.train.batch([image, label], batch_size=args.bsize)

    logits = inference(images, args, True)

    total_loss = loss(logits, labels, args)

    train_op = train(total_loss, global_step, args, lr)

    saver = tf.train.Saver(tf.all_variables(), max_to_keep=10)
    summary_op = tf.merge_all_summaries()
    init = tf.group(tf.initialize_all_variables(), tf.initialize_local_variables())
    sess = tf.Session()

    sess.run(init)
    summary_writer = tf.train.SummaryWriter(args.train_dir, sess.graph)

    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)

    try:
      while not coord.should_stop():

        global_step_value = sess.run(global_step)
        _, loss_value = sess.run([logits, total_loss])

        lr_ = _getLearningRate(global_step_value, args)

        sess.run(train_op, feed_dict={lr: lr_})
    except tf.errors.OutOfRangeError:
      print('Done training -- epoch limit reached')
    finally:
      coord.request_stop()
      coord.join(threads)
    sess.close()
```

It started working when I also passed feed_dict={lr: lr_} to the other sess.run call, so it is not related to the queues. I still don't fully understand why that works.
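If I understand the mechanism correctly, a placeholder only has to be fed in those sess.run calls whose fetched subgraph actually reaches it. A minimal, self-contained sketch of that rule (unrelated to my real graph):

```
import tensorflow as tf

lr = tf.placeholder(tf.float32, shape=[])
x = tf.constant(2.0)
y = x * 3.0            # does not depend on the placeholder
z = x * lr             # depends on the placeholder

with tf.Session() as sess:
    print(sess.run(y))                        # fine: lr is not needed here
    print(sess.run(z, feed_dict={lr: 0.1}))   # must feed lr here
    # sess.run(z) without the feed raises the same
    # "You must feed a value for placeholder tensor" error.
```

So presumably some op fetched in that other sess.run call (e.g. through the loss or a summary) depends on lr, independently of the queue runners.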

0 Answers:

There are no answers yet.