How to retrain more than just the final layer

Posted: 2017-09-30 01:50:32

Tags: tensorflow

I used the retrain.py script from the tensorflow GitHub repository to fine-tune a pretrained InceptionV3 model on my own dataset. I saved the model to disk, and now I want to use it as the starting point for another round of training in which I retrain all of the convolutional layers. Below is the code I am trying to use to create the new graph. I thought that once the graph was loaded into the default graph I could access its variables with tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES) and mark them all as trainable. However, none of the tf.GraphKeys collections (GLOBAL_VARIABLES, TRAINABLE_VARIABLES, MODEL_VARIABLES, etc.) seem to contain anything, so when I try to create the optimizer I get the error "ValueError: No variables to optimize." What am I doing wrong?

import tensorflow as tf

def create_graph(model_path, class_count):
    """Creates a graph from a saved GraphDef file and returns the training ops."""
    with tf.Graph().as_default() as graph:
        # Create the graph from the saved graph_def.pb.
        with tf.gfile.FastGFile(model_path, 'rb') as f:
            graph_def = tf.GraphDef()
            graph_def.ParseFromString(f.read())
            _ = tf.import_graph_def(graph_def, name='')

        # Import the logits tensor from the saved graph.
        logits_tensor = graph.get_tensor_by_name(
            "final_training_ops/biases/final_biases:0")
        ground_truth_input = tf.placeholder(tf.float32,
                                            [None, class_count],
                                            name='GroundTruthInput')

        # This prints an empty list, which is the problem.
        print(tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES))

        # Connect the logits to the training ops.
        with tf.name_scope('cross_entropy'):
            cross_entropy = tf.nn.softmax_cross_entropy_with_logits(
                labels=ground_truth_input, logits=logits_tensor)
        with tf.name_scope('total'):
            cross_entropy_mean = tf.reduce_mean(cross_entropy)
        tf.summary.scalar('cross_entropy', cross_entropy_mean)

        with tf.name_scope('train'):
            optimizer = tf.train.GradientDescentOptimizer(0.001)
            # Raises "ValueError: No variables to optimize."
            train_step = optimizer.minimize(cross_entropy_mean)

    return graph, ground_truth_input, cross_entropy_mean, train_step
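
For reference, this is roughly how I then call the function; the .pb path and class count below are placeholders for my own setup, and the call currently fails inside optimizer.minimize with the ValueError above:

# Placeholder path and class count for my setup; this call fails with
# "ValueError: No variables to optimize." inside create_graph.
graph, ground_truth_input, cross_entropy_mean, train_step = create_graph(
    '/path/to/output_graph.pb', class_count=5)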

0 Answers:

There are no answers yet.