ValueError: Only call `softmax_cross_entropy_with_logits` with named arguments (labels=..., logits=..., ...)

Date: 2017-11-13 04:18:50

Tags: python tensorflow anaconda

Can you help me figure out how to fix this?

with tf.name_scope('loss'):
    #cross_entropy = None
    val = tf.nn.softmax_cross_entropy_with_logits(y_conv, y_)
    cross_entropy = tf.reduce_mean(val)

with tf.name_scope('adam_optimizer'):
    #train_step = None
    train_step = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)

I get this error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-40-f67d0aecc114> in <module>()
      1 with tf.name_scope('loss'):
      2     #cross_entropy = None
----> 3     val = tf.nn.softmax_cross_entropy_with_logits(y_conv, y_)
      4     cross_entropy = tf.reduce_mean(val)
      5 

~/anaconda/lib/python3.6/site-packages/tensorflow/python/ops/nn_ops.py in softmax_cross_entropy_with_logits(_sentinel, labels, logits, dim, name)
   1576   """
   1577   _ensure_xent_args("softmax_cross_entropy_with_logits", _sentinel,
-> 1578                     labels, logits)
   1579 
   1580   # TODO(pcmurray) Raise an error when the labels do not sum to 1. Note: This

~/anaconda/lib/python3.6/site-packages/tensorflow/python/ops/nn_ops.py in _ensure_xent_args(name, sentinel, labels, logits)
   1531   if sentinel is not None:
   1532     raise ValueError("Only call `%s` with "
-> 1533                      "named arguments (labels=..., logits=..., ...)" % name)
   1534   if labels is None or logits is None:
   1535     raise ValueError("Both labels and logits must be provided.")

ValueError: Only call `softmax_cross_entropy_with_logits` with named arguments (labels=..., logits=..., ...)

Also, `tf.__version__` returns '1.0.0'. I'm running Anaconda Python 3.6.2 on OSX Sierra.

1 Answer:

Answer 0 (score: 5)

This is a simple fix: `softmax_cross_entropy_with_logits()` takes three relevant parameters (in order): `_sentinel`, `labels`, and `logits`. The sentinel must be None or an error is raised, which forces you to pass the remaining arguments by name.
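The guard can be illustrated with a minimal standalone sketch (not TensorFlow's actual implementation, just the same sentinel pattern):

```python
def softmax_cross_entropy_with_logits(_sentinel=None, labels=None, logits=None):
    """Mimics TF's guard: a positional call drops the first value into _sentinel."""
    if _sentinel is not None:
        raise ValueError("Only call `softmax_cross_entropy_with_logits` with "
                         "named arguments (labels=..., logits=..., ...)")
    if labels is None or logits is None:
        raise ValueError("Both labels and logits must be provided.")
    return labels, logits  # the real op would compute the loss here

# A positional call lands in _sentinel and raises:
try:
    softmax_cross_entropy_with_logits([1, 0], [2.0, -1.0])
except ValueError as e:
    print(e)

# A named call works:
print(softmax_cross_entropy_with_logits(labels=[1, 0], logits=[2.0, -1.0]))
```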

Fixed (although I'm not sure whether `y_conv` and `y_` are the labels or the logits in this case, so you may need to swap them):

with tf.name_scope('loss'):
    #cross_entropy = None
    val = tf.nn.softmax_cross_entropy_with_logits(labels=y_conv, logits=y_)
    cross_entropy = tf.reduce_mean(val)

with tf.name_scope('adam_optimizer'):
    #train_step = None
    train_step = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)
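To help decide which tensor goes where: the labels should be a probability distribution (typically one-hot targets) while the logits are the raw, unnormalized network outputs, so conventionally `y_conv` would be the logits and `y_` the labels, but only your code can confirm that. A small NumPy sketch of what the op computes (with hypothetical values) shows the asymmetry between the two arguments:

```python
import numpy as np

def softmax_cross_entropy(labels, logits):
    # numerically stable log-softmax over the last axis
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_softmax = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    # cross-entropy: labels is treated as a probability distribution
    return -(labels * log_softmax).sum(axis=-1)

labels = np.array([[0.0, 1.0, 0.0]])   # one-hot target distribution
logits = np.array([[1.0, 3.0, 0.5]])   # raw, unnormalized network scores
print(softmax_cross_entropy(labels, logits))
```

Swapping the two arguments changes the result, which is exactly why the keyword-only API exists: a silent positional mix-up would train against the wrong quantity.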