How to add a trainable weight to an LSTM in Keras / TensorFlow

Asked: 2019-02-11 09:07:42

Tags: tensorflow keras lstm

I want to add a trainable weight to an LSTM. When I use the custom-layer pattern that Keras provides, shown below, the tensor gets initialized but is never actually added to the LSTM layer. The same code works fine when I apply it to a Dense layer or a convolutional network. Is there another way to add variables to a recurrent model?

from keras import backend as K
from keras.layers import Layer


class MyLayer(Layer):
    """Layer that takes a list of two inputs [a, b] and owns a trainable kernel."""

    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim
        super(MyLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        assert isinstance(input_shape, list)
        # Create a trainable weight variable for this layer.
        self.kernel = self.add_weight(name='kernel',
                                      shape=(input_shape[0][1], self.output_dim),
                                      initializer='uniform',
                                      trainable=True)
        super(MyLayer, self).build(input_shape)  # Be sure to call this at the end

    def call(self, x):
        assert isinstance(x, list)
        a, b = x
        # Project a with the kernel, shift by b, and also return the mean of b.
        return [K.dot(a, self.kernel) + b, K.mean(b, axis=-1)]

    def compute_output_shape(self, input_shape):
        assert isinstance(input_shape, list)
        shape_a, shape_b = input_shape
        return [(shape_a[0], self.output_dim), shape_b[:-1]]
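
For reference, one common way to get an additional trainable variable into the recurrent computation itself (rather than into a separate layer around the LSTM) is to write a custom RNN cell and wrap it with keras.layers.RNN. The following is only a minimal sketch, assuming tf.keras is available; the cell and weight names (MyCell, extra_kernel) are illustrative and not part of the question's code or of any Keras API.

import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras import layers


class MyCell(layers.Layer):
    """Minimal recurrent cell with one extra trainable weight (illustrative names)."""

    def __init__(self, units, **kwargs):
        self.units = units
        self.state_size = units
        super(MyCell, self).__init__(**kwargs)

    def build(self, input_shape):
        # Standard input and recurrent kernels.
        self.kernel = self.add_weight(name='kernel',
                                      shape=(input_shape[-1], self.units),
                                      initializer='glorot_uniform',
                                      trainable=True)
        self.recurrent_kernel = self.add_weight(name='recurrent_kernel',
                                                shape=(self.units, self.units),
                                                initializer='orthogonal',
                                                trainable=True)
        # The extra trainable weight that should live inside the recurrence.
        self.extra_kernel = self.add_weight(name='extra_kernel',
                                            shape=(self.units, self.units),
                                            initializer='uniform',
                                            trainable=True)
        super(MyCell, self).build(input_shape)

    def call(self, inputs, states):
        prev_output = states[0]
        h = K.dot(inputs, self.kernel) + K.dot(prev_output, self.recurrent_kernel)
        # Mix in the extra weight so it is updated during training.
        output = K.dot(h, self.extra_kernel)
        return output, [output]


# Wrapping the cell in an RNN layer makes all of its weights,
# including extra_kernel, trainable as part of the model.
inputs = tf.keras.Input(shape=(None, 8))
outputs = layers.RNN(MyCell(32))(inputs)
model = tf.keras.Model(inputs, outputs)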

0 Answers:

There are no answers yet.