tf.nn.sparse_softmax_cross_entropy_with_logits - rank mismatch error

Date: 2017-09-27 11:29:59

Tags: tensorflow softmax cross-entropy

Here is my code:

import tensorflow as tf

with tf.Session() as sess:
    y = tf.constant([0,0,1])
    x = tf.constant([0,1,0])
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
    sess.run()
    print(r.eval())

It produces the following error:

ValueError                                Traceback (most recent call last)
<ipython-input-10-28a8854a9457> in <module>()
      4     y = tf.constant([0,0,1])
      5     x = tf.constant([0,1,0])
----> 6     r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
      7     sess.run()
      8     print(r.eval())

~\AppData\Local\conda\conda\envs\tensorflow\lib\site-packages\tensorflow\python\ops\nn_ops.py in sparse_softmax_cross_entropy_with_logits(_sentinel, labels, logits, name)
   1687       raise ValueError("Rank mismatch: Rank of labels (received %s) should "
   1688                        "equal rank of logits minus 1 (received %s)." %
-> 1689                        (labels_static_shape.ndims, logits.get_shape().ndims))
   1690     # Check if no reshapes are required.
   1691     if logits.get_shape().ndims == 2:

ValueError: Rank mismatch: Rank of labels (received 1) should equal rank of logits minus 1 (received 1).

Can someone help me understand this error? Computing the softmax and the cross-entropy by hand is fairly straightforward.

Also, how am I supposed to use this function when I need to feed batches into it (2-dim arrays)?
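For concreteness, this is the manual computation I mean (plain NumPy, using the logits [0, 1, 0] from my snippet and class 2 as the true label, matching y = [0, 0, 1]):

import numpy as np

logits = np.array([0.0, 1.0, 0.0])
label = 2                                        # index of the true class

probs = np.exp(logits) / np.sum(np.exp(logits))  # softmax
loss = -np.log(probs[label])                     # cross-entropy for this single example
print(loss)                                      # ~1.5514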

Update

I also tried:

import tensorflow as tf

with tf.Session() as sess:
    y = tf.constant([1])
    x = tf.constant([0,1,0])
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
    sess.run()
    print(r.eval())

and it produced the same error.

1 Answer:

Answer 0 (score: 1)

Fixed it for you. x needs to be two-dimensional: the sparse variant takes labels as integer class indices of shape [batch_size], and the rank of labels must equal the rank of logits minus one, so logits must have shape [batch_size, num_classes].

import tensorflow as tf

with tf.Session() as sess:
    y = tf.constant([1])                                 # shape [1]: one class index per example
    x = tf.expand_dims(tf.constant([0.0, 1.0, 0.0]), 0)  # shape [1, 3]: a batch of one logit row
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
    print(r.eval())
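To address the batch question as well, here is a minimal sketch (assuming the same TF 1.x session API): pass logits of shape [batch_size, num_classes] together with integer labels of shape [batch_size], and you get back one loss value per example:

import tensorflow as tf

with tf.Session() as sess:
    # One integer class index per example, shape [2].
    labels = tf.constant([2, 0])
    # One row of logits per example, shape [2, 3].
    logits = tf.constant([[0.0, 1.0, 0.0],
                          [3.0, 1.0, 0.2]])
    loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)
    print(sess.run(loss))  # one cross-entropy value per example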