With TensorFlow, I've made a dataset with dataset = tf.data.TFRecordDataset(filename)
and an iterator with iterator = dataset.make_one_shot_iterator().
Then in each round, iterator.get_next()
gives out a mini-batch of data as input.
I am training a network with a Dropout
layer, so I'm supposed to write something like this:
sess.run(train_op, feed_dict={keep_prob: 0.5})
accuracy, loss_value = sess.run([acc, loss], feed_dict={keep_prob: 1.0})
in which keep_prob
represents the probability of keeping a neuron alive, which differs between the training and testing (here, evaluation) phases.
The problem is that each sess.run()
triggers iterator.get_next()
to fetch a new batch of input, which is not what I intended.
What should I do if I want these two sess.run()
calls to see the same input tensors?
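To make the behavior concrete, here is a minimal sketch (assumptions: the TF1-style graph/session API via tf.compat.v1, and a toy range dataset standing in for the TFRecord data) showing that every sess.run() that fetches the iterator output consumes a fresh batch:

```python
import tensorflow.compat.v1 as tf  # TF1-style graph/session API

tf.disable_eager_execution()

# Toy dataset standing in for tf.data.TFRecordDataset(filename)
dataset = tf.data.Dataset.range(6).batch(2)
iterator = tf.data.make_one_shot_iterator(dataset)
next_batch = iterator.get_next()

with tf.Session() as sess:
    first = sess.run(next_batch)   # [0 1]
    second = sess.run(next_batch)  # [2 3] -- already a different batch
```

So running train_op in one sess.run() and the metrics in another means the two calls never see the same batch.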
Thank you very much :-)
Answer 0: (score: 0)
Merry Christmas!
Many thanks to Santa for the present :-)
I was just directed to this place, where you can find the answer to this question.
The main idea is to create the iterator with tf.data.Iterator.from_structure()
rather than tf.data.Dataset.make_initializable_iterator(),
and initialize the iterator separately for the training, validation, and test datasets.
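A minimal sketch of that idea (assumptions: the TF1-style API via tf.compat.v1, and toy range datasets in place of the real training/validation data): one iterator is defined from the datasets' shared structure, and a separate make_initializer() op rebinds it to each dataset.

```python
import tensorflow.compat.v1 as tf  # TF1-style graph/session API

tf.disable_eager_execution()

# Toy stand-ins for the real training and validation datasets
train_ds = tf.data.Dataset.range(4).batch(2)
val_ds = tf.data.Dataset.range(100, 104).batch(2)

# One iterator defined by structure (types/shapes), not tied to a dataset
iterator = tf.data.Iterator.from_structure(
    tf.data.get_output_types(train_ds),
    tf.data.get_output_shapes(train_ds))
next_batch = iterator.get_next()

# Separate initializer ops bind the same iterator to each dataset
train_init_op = iterator.make_initializer(train_ds)
val_init_op = iterator.make_initializer(val_ds)

with tf.Session() as sess:
    sess.run(train_init_op)
    train_batch = sess.run(next_batch)  # [0 1]
    sess.run(val_init_op)
    val_batch = sess.run(next_batch)    # [100 101]
```

Because the same next_batch tensor serves every dataset, the model graph is built once and the initializer ops decide which data flows through it.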
Merry Christmas and Happy New Year!