Computing a recursive mean in a Keras custom loss function

Asked: 2020-10-16 11:41:54

Tags: keras tensorflow2.0

I have an unconventional use case that requires computing a recursive mean inside a custom Keras loss function.

The recursive mean is computed as (I wish I had proper math notation here) $m_t = m_{t-1} + (x_t - m_{t-1}) / t$. So I hacked together a solution that creates `tf.Variable`s outside the model, as follows:

import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Input, Dense

def my_loss(m_t, m_t_1, t):
    def _my_loss(y_true, y_pred):
        # Save the previous mean and advance the step counter.
        m_t_1.assign(m_t)
        t.assign_add(1.)

        # Fold the current batch statistic into the running mean.
        x = K.mean(y_true * y_pred)
        m_t.assign(m_t_1 + (x - m_t_1) / t)

        return -1. * m_t

    return _my_loss

input_layer = Input(shape=(5,))
output_layer = Dense(1, activation='linear')(input_layer)
m_t = tf.Variable(initial_value=0., shape=tf.TensorShape(None), trainable=False)
m_t_1 = tf.Variable(initial_value=0., shape=tf.TensorShape(None), trainable=False)
t = tf.Variable(initial_value=0., shape=tf.TensorShape(None), trainable=False)
loss = my_loss(m_t, m_t_1, t)

model = keras.Model(input_layer, output_layer)
model.compile(loss=loss, optimizer='sgd')

model.train_on_batch(x=np.random.normal(0., 1., (10, 5)), y=np.random.normal(0., 1., (10, 1)))

When I run this, I get the following error message:

ValueError: in converted code:

    C:\Users\Steven\Anaconda3\envs\tf\lib\site-packages\tensorflow_core\python\keras\engine\training_eager.py:305 train_on_batch  *
        outs, total_loss, output_losses, masks = (
    C:\Users\Steven\Anaconda3\envs\tf\lib\site-packages\tensorflow_core\python\keras\engine\training_eager.py:273 _process_single_batch
        model.optimizer.apply_gradients(zip(grads, trainable_weights))
    C:\Users\Steven\Anaconda3\envs\tf\lib\site-packages\tensorflow_core\python\keras\optimizer_v2\optimizer_v2.py:426 apply_gradients
        grads_and_vars = _filter_grads(grads_and_vars)
    C:\Users\Steven\Anaconda3\envs\tf\lib\site-packages\tensorflow_core\python\keras\optimizer_v2\optimizer_v2.py:1039 _filter_grads
        ([v.name for _, v in grads_and_vars],))

    ValueError: No gradients provided for any variable: ['dense_2/kernel:0', 'dense_2/bias:0'].

Any ideas on how I can keep state inside a loss function?
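For what it's worth, the update rule itself is fine: the recursion $m_t = m_{t-1} + (x_t - m_{t-1}) / t$ reproduces the ordinary arithmetic mean. A minimal pure-Python sanity check (the helper name `recursive_mean` is just for illustration):

```python
def recursive_mean(xs):
    """Running mean via m_t = m_{t-1} + (x_t - m_{t-1}) / t."""
    m = 0.0
    for t, x in enumerate(xs, start=1):
        m = m + (x - m) / t
    return m

print(recursive_mean([1.0, 2.0, 3.0, 4.0]))  # 2.5, the arithmetic mean
```

So the problem is not the recursion but how TensorFlow treats the variable assignments during gradient computation.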

0 Answers:

No answers yet