Loss function produces a NaN/None gradient

Time: 2020-05-09 21:39:33

Tags: python keras deep-learning loss-function

I am using a histogram loss as my model's loss function, but it produces a NaN/None gradient. Code snippet (the loss function):

import tensorflow as tf
from tensorflow.keras import backend as K

def histogram_loss(y_true, y_pred):
    # Count values into 20 fixed-width bins over [-1, 1].
    h_true = tf.histogram_fixed_width(y_true, value_range=(-1., 1.), nbins=20)
    h_pred = tf.histogram_fixed_width(y_pred, value_range=(-1., 1.), nbins=20)
    # Bin counts are int32; cast to float32 for the MSE below.
    h_true = tf.cast(h_true, dtype=tf.dtypes.float32)
    h_pred = tf.cast(h_pred, dtype=tf.dtypes.float32)
    return K.mean(K.square(h_true - h_pred))

Error message:

ValueError: An operation has `None` for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval.

Why does this ValueError (`None` gradient) occur?

1 Answer:

Answer 0 (score: 0)

The gradient of tf.histogram_fixed_width is None: bin counting is a piecewise-constant operation, so it is not differentiable. You can verify this yourself:

import numpy as np
import tensorflow as tf

x = tf.Variable(np.random.uniform(0, 10, 100), dtype=tf.float32)

with tf.GradientTape() as tape:
    # Bin counts do not change under infinitesimal changes of x,
    # so no gradient flows through this op.
    hist = tf.histogram_fixed_width(x, value_range=(-1., 1.), nbins=20)
    hist = tf.cast(hist, dtype=tf.dtypes.float32)

grads = tape.gradient(hist, x)
grads  # None
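One common workaround (a sketch, not part of the original answer) is to replace hard bin counting with a soft, differentiable approximation: each sample contributes to nearby bins through a Gaussian kernel, so gradients can flow back to the inputs. The function name `soft_histogram` and the bandwidth `sigma` below are illustrative choices, not TensorFlow API:

```python
import numpy as np
import tensorflow as tf

def soft_histogram(x, value_range=(-1.0, 1.0), nbins=20, sigma=0.05):
    # Evenly spaced bin centers over the value range.
    centers = tf.linspace(value_range[0], value_range[1], nbins)
    # Each sample contributes smoothly to every bin, weighted by a
    # Gaussian in its distance to the bin center; this is differentiable,
    # unlike the piecewise-constant tf.histogram_fixed_width.
    diff = tf.expand_dims(x, -1) - centers        # shape (n, nbins)
    weights = tf.exp(-0.5 * tf.square(diff / sigma))
    return tf.reduce_sum(weights, axis=0)         # shape (nbins,)

x = tf.Variable(np.random.uniform(-1, 1, 100), dtype=tf.float32)

with tf.GradientTape() as tape:
    hist = soft_histogram(x)
    loss = tf.reduce_mean(tf.square(hist))

grads = tape.gradient(loss, x)  # a real tensor now, not None
```

The same idea applied to both `y_true` and `y_pred` would make a histogram-style loss trainable; `sigma` trades off how closely the soft histogram approximates the hard counts against gradient smoothness.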