How do I get the trainable weights when running a Keras session manually?

Asked: 2016-10-27 05:33:56

Tags: keras

Because I am running the session manually, I can't seem to collect the trainable weights of a specific layer.

    x = Convolution2D(16, 3, 3, init='he_normal', border_mode='same')(img)

    for i in range(0, self.blocks_per_group):
        nb_filters = 16 * self.widening_factor
        x = residual_block(x, nb_filters=nb_filters, subsample_factor=1)

    for i in range(0, self.blocks_per_group):
        nb_filters = 32 * self.widening_factor
        if i == 0:
            subsample_factor = 2
        else:
            subsample_factor = 1
        x = residual_block(x, nb_filters=nb_filters, subsample_factor=subsample_factor)

    for i in range(0, self.blocks_per_group):
        nb_filters = 64 * self.widening_factor
        if i == 0:
            subsample_factor = 2
        else:
            subsample_factor = 1
        x = residual_block(x, nb_filters=nb_filters, subsample_factor=subsample_factor)

    x = BatchNormalization(axis=3)(x)
    x = Activation('relu')(x)
    x = AveragePooling2D(pool_size=(8, 8), strides=None, border_mode='valid')(x)
    x = tf.reshape(x, [-1, np.prod(x.get_shape()[1:].as_list())])

    # Readout layer
    preds = Dense(self.nb_classes, activation='softmax')(x)

    loss = tf.reduce_mean(categorical_crossentropy(labels, preds))

    optimizer = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

    with sess.as_default():

        for i in range(10):

            batch = self.next_batch(self.batch_num)
            _, l = sess.run([optimizer, loss],
                            feed_dict={img: batch[0], labels: batch[1]})
            print(l)
            print(type(weights))

I am trying to get the weights of the last convolutional layer.

I tried get_trainable_weights(layer) and layer.get_weights(), but I'm not getting anywhere.

Error:

AttributeError: 'Tensor' object has no attribute 'trainable_weights'

1 answer:

Answer 0 (score: -1)

From looking at the source* it seems you are looking for layer.trainable_weights (it is a list, not a member function). Note that this returns Tensor objects, not values. Also note that in your code x is the Tensor a layer returns, not the layer itself, which is why you get the AttributeError; you need a reference to the layer object.

If you want the actual values, you need to evaluate them in the session:

    # Sketch: assumes `layer` is the layer object itself (e.g. kept as
    # `layer = Convolution2D(...)` before calling it on the input tensor)
    weight_values = sess.run(layer.trainable_weights)

* https://github.com/fchollet/keras/blob/master/keras/layers/convolutional.py#L401
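Putting it together, a minimal end-to-end sketch of the idea. This is not the answerer's exact code: it uses the modern tf.keras Conv2D (equivalent to the question's Keras-1 Convolution2D) in TF1-compat graph mode, since the question drives a Session manually.

```python
# Minimal sketch, assuming a TF2 install with the v1 compat API available.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

img = tf.placeholder(tf.float32, shape=(None, 32, 32, 3))

# Keep a reference to the layer *object*; calling it on `img` returns a
# Tensor, and Tensors have no trainable_weights attribute (hence the error).
conv = tf.keras.layers.Conv2D(16, 3, padding='same')
x = conv(img)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # trainable_weights is a list of Variables: [kernel, bias]
    kernel, bias = sess.run(conv.trainable_weights)
    print(kernel.shape, bias.shape)  # (3, 3, 3, 16) (16,)
```

The key point is that `conv` (the Layer) owns the variables, while `x` (the Tensor) is just the symbolic output of applying it.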