How to implement this loss in Keras

Time: 2019-06-30 02:40:18

Tags: tensorflow keras

I want to implement a loss like this: Loss

Here is the code:

def loss(output, target, from_logits=False):
    # cross-entropy term
    L1 = -tf.reduce_sum(target * tf.log(output),
                        reduction_indices=len(output.get_shape()) - 1)
    # squared difference between the expected class value and the target y
    L2 = tf.reduce_sum(tf.square(tf.subtract(
        tf.reduce_sum(tf.multiply(j, output),
                      reduction_indices=len(output.get_shape()) - 1), y)))
    # third term; p_ik stands for the per-class probabilities p_{i,k}
    L3 = tf.reduce_sum(tf.reduce_sum(tf.multiply(
        output,
        tf.square(tf.subtract(j, tf.reduce_prod(tf.multiply(k, p_ik)))))))
    loss = L1 + L2 + L3
    return loss
  • Is my implementation of the formula correct?

Please help me. Thank you very much.

1 Answer:

Answer 0: (score: 1)

  • Suppose you have y_true with shape (samples, 11), one-hot encoded.
  • Suppose you use a softmax activation in the last layer (so the class probabilities sum to 1).
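These two assumptions can be sanity-checked with plain NumPy. The sketch below (batch size and class values 10..20 are illustrative, matching the `K.arange(10, 21)` used in the answer) shows how multiplying a one-hot row by the class-value vector recovers the class value as a number:

```python
import numpy as np

j = np.arange(10, 21, dtype="float32")           # class values 10..20, shape (11,)
y_true = np.zeros((2, 11), dtype="float32")      # batch of 2 one-hot rows
y_true[0, 3] = 1.0                               # class at index 3 -> value 13
y_true[1, 10] = 1.0                              # class at index 10 -> value 20

# one-hot row times class values, summed, gives the class value per sample
y = np.sum(y_true * j, axis=-1, keepdims=True)   # shape (2, 1)
```

This is exactly the trick the answer's code uses to turn `y_true` back into a class number `y`.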

Keras losses have the form def func(y_true, y_pred):

import keras.backend as K
from keras.losses import categorical_crossentropy

def loss(y_true, y_pred):

    #p
    p = y_pred                                        #(samples,11)

    #j or k
    j = K.cast(K.arange(10, 21), K.floatx())          #(11,)
    j = K.reshape(j, (1,-1))                          #(1,11)

    #y_true as class number
    y = K.sum(y_true * j, axis=-1, keepdims=True)    #(samples, 1)

    #first term:
    L1 = categorical_crossentropy(y_true, y_pred)     #(samples,)

    #second term, with y outside the sum over j
    kpk = j * p                                   #(samples, 11)
    kpkSum = K.sum(kpk, axis=-1, keepdims=True)   #(samples, 1)
    L2 = kpkSum - y                               #(samples, 1)
    L2 = K.square(L2) / 2.                        #(samples, 1)

    #third term:
    L3 = K.square(j - kpkSum)                      #(samples,11)
    L3 = p * L3                                    #(samples,11)
    L3 = K.sum(L3, axis=-1)                        #(samples,)

    return L1 + L2 + L3 #the mean over N is taken automatically by Keras
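Since the Keras-backend code is awkward to unit-test without a session, the same three terms can be re-implemented in plain NumPy to check shapes and values. This is a sketch under the same assumptions (11 classes with values 10..20); the small `eps` inside the log is my addition to avoid log(0) and is not part of the answer's code:

```python
import numpy as np

def loss_np(y_true, y_pred, eps=1e-7):
    """NumPy sketch of the three-term loss; y_true one-hot, y_pred softmax, both (samples, 11)."""
    j = np.arange(10, 21, dtype="float64").reshape(1, -1)    # class values, (1, 11)
    y = np.sum(y_true * j, axis=-1, keepdims=True)           # class value per sample, (samples, 1)

    # first term: categorical cross-entropy
    L1 = -np.sum(y_true * np.log(y_pred + eps), axis=-1)     # (samples,)

    # second term: squared error of the expected class value
    kpkSum = np.sum(j * y_pred, axis=-1, keepdims=True)      # E[j] under y_pred, (samples, 1)
    L2 = np.square(kpkSum - y) / 2.0                         # (samples, 1)

    # third term: variance of the class value under y_pred
    L3 = np.sum(y_pred * np.square(j - kpkSum), axis=-1)     # (samples,)

    return L1 + L2.ravel() + L3                              # (samples,)
```

As a quick check: for a perfect one-hot prediction (`y_pred == y_true`), the expected value equals the true class value and the variance is zero, so all three terms are (numerically) zero.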