Loop over each sample in a custom Keras layer and return the result

Date: 2019-12-26 15:22:29

Tags: python keras layer

I want to implement a custom layer. Basically, I want to operate on ever smaller subsets of the same vector, so I feed the duplicated vector into my custom layer as a batch. I thought it would be easy to iterate over each sample of the batch and reassemble a result tensor of the same shape. Apparently it is not. Does anyone have an idea how to achieve the desired result?

import tensorflow as tf
from keras import backend as K
from keras.initializers import Constant
from keras.layers import Layer

class MyLayer(Layer):
    def __init__(self, parameters=1):
        super().__init__()
        # number of trainable kernel entries (assumed default; needed by build below)
        self.parameters = parameters
    def build(self, input_shape):
        self.kernel = self.add_weight(name='kernel',
                                      shape=(self.parameters,),
                                      initializer=Constant(0.5),
                                      trainable=True)

        # Be sure to call this at the end
        super().build(input_shape)

    def call(self, x, **kwargs):
        res = []
        # does not work, NoneType is not iterable
        # for sample in range(K.shape(x)[0]):
        for sample in range(2):
            xs = x[sample][(sample * 5):]
            # simulate some verbose math on xs ...
            calc = K.zeros(xs.shape)
            res.append(K.concatenate([x[0][:(sample * 5)], calc]))

        # fails as well :-(
        # InvalidArgumentError: slice index 1 of dimension 0 out of bounds. [[{{node lppl_layer_74/strided_slice_12}}]]
        return K.stack(res)
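
The commented-out loop cannot work because, while the graph is being built, the batch dimension is not a concrete number yet. A standalone snippet (not part of the original code) that makes this visible:

import keras.backend as K
from keras.layers import Input

x = Input(shape=(100,))     # symbolic batch of vectors, shape (None, 100)
print(K.int_shape(x))       # (None, 100): the batch size is unknown at graph-build time
print(K.shape(x))           # a symbolic tensor, not a Python integer, so range() cannot consume it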

To fit the model, I just use some garbage data:

import numpy as np
from keras.callbacks import EarlyStopping
from keras.models import Sequential
from keras.optimizers import SGD

model = Sequential([MyLayer()])
model.compile(loss='mse', optimizer=SGD(0.2, 0.01))
x = np.random.random((100,))
x = x.reshape(1, -1)
x2 = np.vstack([x, x])

model.fit(x2, x2, epochs=5000, verbose=0, batch_size=x.shape[0], callbacks=[EarlyStopping('loss')])
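
For what it's worth: if the "verbose math" can be expressed on the whole batch at once, the per-sample Python loop can be sidestepped by turning the i*5 boundary into a mask built from the batch index. This is only a rough sketch under that assumption; MaskedLayer and everything inside it are illustrative, not from the question:

import tensorflow as tf
from keras.layers import Layer

class MaskedLayer(Layer):
    def call(self, x, **kwargs):
        n = tf.shape(x)[0]             # dynamic batch size, usable even though it is None statically
        length = tf.shape(x)[1]        # vector length
        boundaries = tf.range(n) * 5   # sample i keeps its first i*5 entries unchanged
        mask = tf.sequence_mask(boundaries, maxlen=length, dtype=x.dtype)
        calc = tf.zeros_like(x)        # stand-in for the verbose math, applied to the whole batch
        return mask * x + (1 - mask) * calc

    def compute_output_shape(self, input_shape):
        return input_shape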

0 Answers