What is the TensorFlow loss equivalent of "binary crossentropy"?

Asked: 2018-08-09 08:38:05

Tags: tensorflow keras loss-function

I am trying to rewrite a Keras graph as a TensorFlow graph, and I'd like to know which loss function is equivalent to "binary crossentropy". Is it tf.nn.softmax_cross_entropy_with_logits_v2?

Thanks a lot!

1 answer:

Answer 0 (score: 2)

No. The implementation of binary_crossentropy with the TensorFlow backend is defined here:

@tf_export('keras.backend.binary_crossentropy')
def binary_crossentropy(target, output, from_logits=False):
    """Binary crossentropy between an output tensor and a target tensor.
    Arguments:
      target: A tensor with the same shape as `output`.
      output: A tensor.
      from_logits: Whether `output` is expected to be a logits tensor.
          By default, we consider that `output`
          encodes a probability distribution.
    Returns:
      A tensor.
    """
    # Note: nn.sigmoid_cross_entropy_with_logits
    # expects logits, Keras expects probabilities.
    if not from_logits:
        # transform back to logits
        epsilon_ = _to_tensor(epsilon(), output.dtype.base_dtype)
        output = clip_ops.clip_by_value(output, epsilon_, 1 - epsilon_)
        output = math_ops.log(output / (1 - output))
    return nn.sigmoid_cross_entropy_with_logits(labels=target, logits=output)

So it uses sigmoid cross-entropy (tf.nn.sigmoid_cross_entropy_with_logits), not softmax cross-entropy.
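To see why the two are equivalent, here is a minimal NumPy sketch (no TensorFlow dependency) that mirrors the snippet above: clip the probabilities, convert them back to logits, apply the numerically stable sigmoid cross-entropy formula that tf.nn.sigmoid_cross_entropy_with_logits documents, and compare against the textbook definition of binary crossentropy. The function names and the sample values are illustrative, not part of any API.

```python
import numpy as np

def sigmoid_xent(labels, logits):
    # Numerically stable form documented for
    # tf.nn.sigmoid_cross_entropy_with_logits:
    #   max(x, 0) - x * z + log(1 + exp(-|x|))
    return (np.maximum(logits, 0) - logits * labels
            + np.log1p(np.exp(-np.abs(logits))))

def bce_from_probs(target, output, eps=1e-7):
    # Mirrors the Keras backend snippet: clip probabilities,
    # transform back to logits, then apply sigmoid cross-entropy.
    output = np.clip(output, eps, 1 - eps)
    logits = np.log(output / (1 - output))
    return sigmoid_xent(target, logits)

target = np.array([1.0, 0.0, 1.0])
probs = np.array([0.9, 0.2, 0.6])

# Textbook binary crossentropy on probabilities:
#   -z * log(p) - (1 - z) * log(1 - p)
naive = -(target * np.log(probs) + (1 - target) * np.log(1 - probs))

print(np.allclose(bce_from_probs(target, probs), naive))  # True
```

Note that sigmoid cross-entropy is applied element-wise (each output is an independent binary decision), whereas softmax cross-entropy normalizes across classes, which is why the two losses are not interchangeable.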