Accessing a custom layer's variables in Keras

Asked: 2019-08-21 15:36:43

Tags: keras deep-learning keras-layer

Suppose we have a custom layer in Keras, as follows:

import numpy as np
import tensorflow as tf
from keras import backend as K
from keras.layers import Layer


class Custom_Layer(Layer):
    def __init__(self, **kwargs):
        super(Custom_Layer, self).__init__(**kwargs)
        self.params_1 = 0
        self.params_2 = 0
    def build(self, input_shape):
        self.params_1 = K.variable(np.zeros(shape=input_shape[1::]))
        self.params_2 = K.variable(np.zeros(shape=input_shape[1::]))
        super(Custom_Layer,self).build(input_shape) 

    def call(self, x, training=None):
        # DO SOMETHING

How can I access the values of these parameters (params_1, params_2) during training? I tried to get them with model.get_layer('name of the custom layer').params_1, but that way I could not access the parameter values.

Here is the model architecture:

from keras.layers import Input, Conv2D, MaxPooling2D, Dropout, Flatten, Dense, Activation
from keras.models import Model


def get_model(img_height, img_width):
    input_layer = Input(shape=(img_height, img_width, 3))
    x = Conv2D(32, (3, 3), padding='same', name='conv2d_1', activation='relu')(input_layer)
    x = Custom_Layer()(x)
    x = MaxPooling2D(pool_size=(2, 2))(x)
    x = Dropout(0.25)(x)
    x = Conv2D(64, kernel_size=(3, 3), name='conv2d_2', activation='relu')(x)
    x = Conv2D(64, (3, 3), name='conv2d_4', activation='relu')(x)
    x = MaxPooling2D(pool_size=(2, 2))(x)
    x = Dropout(0.25)(x)
    x = Flatten()(x)
    x = Dense(512)(x)
    x = Activation('relu')(x)
    x = Dropout(0.5)(x)
    x = Dense(10)(x)
    x = Activation('softmax')(x)
    model = Model(inputs=[input_layer], outputs=[x])
    model.summary()

    return model

1 Answer:

Answer 0 (score: 0)

Note that params_1 and params_2 are TensorFlow tensors. To get their values, you have to run them inside a tf.Session. You can do that as follows:

from keras import backend as K

# ... train model

sess = K.get_session()
params_1 = model.get_layer('Name of Custom Layer').params_1
values_1 = sess.run(params_1)
print(values_1)

Note: not tested.
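If you want the values while training is running (rather than after model.fit returns), a Keras Callback is a natural place to read them. The sketch below assumes the custom layer was given the name 'custom_layer' when it was added to the model (a hypothetical name, not from the original code) and uses K.get_value, which evaluates a Keras variable without managing a session explicitly:

```python
from keras import backend as K
from keras.callbacks import Callback


class ParamLogger(Callback):
    """Prints the custom layer's variables at the end of each epoch.

    Assumes the layer was registered under the name 'custom_layer',
    e.g. via Custom_Layer(name='custom_layer') in get_model.
    """

    def on_epoch_end(self, epoch, logs=None):
        layer = self.model.get_layer('custom_layer')
        # K.get_value evaluates a backend variable and returns a numpy array
        values_1 = K.get_value(layer.params_1)
        values_2 = K.get_value(layer.params_2)
        print('epoch %d: params_1 mean=%.4f, params_2 mean=%.4f'
              % (epoch, values_1.mean(), values_2.mean()))


# Usage (sketch):
# model.fit(x_train, y_train, epochs=5, callbacks=[ParamLogger()])
```

This avoids calling sess.run yourself and works at any point during training, since Keras invokes the callback between epochs.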