keras: Smooth L1 loss

Date: 2017-05-23 09:26:59

Tags: deep-learning keras loss-function

I am trying to define a custom loss function (smooth L1 loss) in Keras as shown below, but it raises the following error:


ValueError: Shape must be rank 0 but is rank 5 for 'cond/Switch' (op: 'Switch') with input shapes: [?,24,24,24,?], [?,24,24,24,?].

from keras import backend as K
import numpy as np


def smooth_L1_loss(y_true, y_pred):
    THRESHOLD = K.variable(1.0)
    mae = K.abs(y_true - y_pred)
    flag = K.greater(mae, THRESHOLD)
    # The ValueError comes from K.switch: in older Keras versions it expects a
    # scalar (rank-0) condition, but `flag` has the same rank as y_true/y_pred.
    loss = K.mean(K.switch(flag, (mae - 0.5), K.pow(mae, 2)), axis=-1)
    return loss

3 Answers:

Answer 0 (score: 4)

Here is an implementation of smooth L1 loss using keras.backend:

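A minimal backend-only sketch of such a loss; the threshold `delta` and the exact formulation are illustrative, not necessarily what the original answer used:

from keras import backend as K


def smooth_l1_loss(y_true, y_pred, delta=1.0):
    # quadratic for |error| <= delta, linear beyond it (Huber-style)
    error = K.abs(y_true - y_pred)
    quadratic = K.minimum(error, delta)
    linear = error - quadratic
    return K.mean(0.5 * K.square(quadratic) + delta * linear, axis=-1)

This avoids K.switch entirely, so it does not depend on a Keras version where the condition may be element-wise.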

Answer 1 (score: 2)

from keras import backend as K

HUBER_DELTA = 0.5  # switch point between the quadratic and linear regions


def smoothL1(y_true, y_pred):
    x = K.abs(y_true - y_pred)
    if K.backend() == 'tensorflow':
        import tensorflow as tf
        # element-wise branch via tf.where avoids the rank-0 restriction of K.switch
        x = tf.where(x < HUBER_DELTA, 0.5 * x ** 2, HUBER_DELTA * (x - 0.5 * HUBER_DELTA))
        return K.sum(x)
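The custom function is then passed to model.compile like any built-in loss; a minimal usage sketch (the model below is illustrative only):

from keras.models import Sequential
from keras.layers import Dense

# illustrative model only; any Keras model is compiled the same way
model = Sequential([Dense(1, input_shape=(10,))])
model.compile(optimizer='adam', loss=smoothL1)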

Answer 2 (score: 0)

I know I am two years late, but if you use TensorFlow as the Keras backend, you can use TensorFlow's Huber loss (which is essentially the same thing), like this:

import tensorflow as tf


def smooth_L1_loss(y_true, y_pred):
    # TF 1.x API; the transition point delta defaults to 1.0
    return tf.losses.huber_loss(y_true, y_pred)
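If you are on TensorFlow 2.x, where tf.losses.huber_loss is no longer available, the built-in equivalent is the tf.keras.losses.Huber class; a minimal sketch (delta defaults to 1.0):

import tensorflow as tf

# TF 2.x equivalent of the snippet above
smooth_L1_loss = tf.keras.losses.Huber(delta=1.0)
# e.g. model.compile(optimizer='adam', loss=smooth_L1_loss)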