Unexpected output from tf.nn.sparse_softmax_cross_entropy_with_logits

Date: 2019-04-08 12:36:54

Tags: python tensorflow neural-network

The TensorFlow documentation for tf.nn.sparse_softmax_cross_entropy_with_logits explicitly states that I should not apply softmax to the input of this op:


This op expects unscaled logits, since it performs a softmax on logits internally for efficiency. Do not call this op with the output of softmax, as it will produce incorrect results.
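To see what the docs mean, here is a minimal sketch (my own toy example, not from the question): the same op is called once with raw logits and once with already-softmaxed values, and the two losses differ because softmax ends up being applied twice in the second call.

import tensorflow as tf

# Toy example (values made up): one sample, three classes, true class 0.
logits = tf.constant([[2.0, 1.0, 0.1]])
labels = tf.constant([0])

# Correct usage: pass the raw, unscaled logits; the op applies softmax internally.
loss_ok = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)

# Incorrect usage: feeding softmax output means softmax is effectively applied twice.
loss_wrong = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=labels, logits=tf.nn.softmax(logits))

with tf.Session() as sess:
    print(sess.run([loss_ok, loss_wrong]))  # the two losses differ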

However, if I use the cross entropy without softmax, it gives me unexpected results. According to {{3}}, for CIFAR-10 the expected loss value is about 2.3:


For example, for CIFAR-10 with a Softmax classifier we would expect the initial loss to be 2.302, because we expect a diffuse probability of 0.1 for each class (since there are 10 classes), and Softmax loss is the negative log probability of the correct class, so: -ln(0.1) = 2.302.
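As a quick sanity check of that number (my own one-liner, not part of the quoted notes):

import numpy as np
print(-np.log(0.1))  # 2.3025..., the expected initial loss with 10 equally likely classes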

However, without the softmax I get much larger values, e.g. 108.91984.

What am I doing wrong with sparse_softmax_cross_entropy_with_logits? The TF code is shown below.

import tensorflow as tf
import numpy as np
from tensorflow.python import keras


(_, _), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_test = np.reshape(x_test, [-1, 32, 32, 3])

y_test = np.reshape(y_test, (10000,))
y_test = y_test.astype(np.int32)

x = tf.placeholder(dtype=tf.float32, shape=(None, 32, 32, 3))
y = tf.placeholder(dtype=tf.int32, shape=(None,))

layer = tf.layers.Conv2D(filters=16, kernel_size=3)(x)
layer = tf.nn.relu(layer)
layer = tf.layers.Flatten()(layer)
layer = tf.layers.Dense(units=1000)(layer)
layer = tf.nn.relu(layer)
logits = tf.layers.Dense(units=10)(layer)

# If this line is uncommented I get expected value around 2.3
# logits = tf.nn.softmax(logits)

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y,
                                                      logits=logits)
loss = tf.reduce_mean(loss, name='cross_entropy')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    res = sess.run(loss, feed_dict={x: x_test[0:256], y: y_test[0:256]})
    print("loss: ", res)
    # Expected output is value close to 2.3
    # Real outputs are 108.91984, 72.82324, etc.

1 Answer:

Answer 0 (score: 1)

The issue is not in the lines

# If this line is uncommented I get expected value around 2.3
# logits = tf.nn.softmax(logits)

Images in the cifar10 dataset are RGB, so pixel values are integers in the range [0, 255]. If you divide your x_test by 255

x_test = np.reshape(x_test, [-1, 32, 32, 3]).astype(np.float32) / 255

the values will be rescaled to [0, 1] and tf.nn.sparse_softmax_cross_entropy_with_logits will return the expected values around 2.3.
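For intuition about why unnormalized inputs inflate the loss so much, here is a small NumPy sketch (my own illustration, not part of the original answer). With zero-initialized biases, the conv/dense/ReLU network in the question scales roughly linearly with its input, so feeding values in [0, 255] instead of [0, 1] produces logits that are roughly 255 times larger; at that scale the softmax saturates and the negative log probability of the correct class can reach hundreds.

import numpy as np

def xent(logits, label):
    # Reference softmax cross entropy for a single example (log-sum-exp trick).
    shifted = logits - logits.max()
    log_probs = shifted - np.log(np.exp(shifted).sum())
    return -log_probs[label]

# Hypothetical logits for 10 classes, roughly the scale you get with inputs in [0, 1].
logits = np.array([0.5, -0.3, 1.2, 0.1, -0.8, 0.4, 0.0, 0.7, -0.2, 0.3])

print(xent(logits, label=1))          # a few nats, in the same ballpark as -ln(0.1)
print(xent(logits * 255.0, label=1))  # same relative scores, but the loss blows up to hundreds

Normalizing the inputs keeps the initial logits small, which is why the reported loss then lands near -ln(0.1) ≈ 2.3.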