InvalidArgumentError: logits and labels must be same size: logits_size=[80,2] labels_size=[1,80]

Asked: 2018-03-22 02:21:31

Tags: python tensorflow machine-learning conv-neural-network

I am adapting this tutorial here so that I can train a ConvNet on my own set of images.

I am getting this error:

Traceback (most recent call last):
  File "scr.py", line 416, in <module>
    optimize(1)
  File "scr.py", line 390, in optimize
    session.run(optimizer, feed_dict=feed_dict_train)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/client/session.py", line 905, in run
    run_metadata_ptr)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/client/session.py", line 1140, in _run
    feed_dict_tensor, options, run_metadata)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/client/session.py", line 1358, in _do_run
    options, run_metadata)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/client/session.py", line 1377, in _do_call
    raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.InvalidArgumentError: logits and labels must be same size: logits_size=[80,2] labels_size=[1,80]
     [[Node: softmax_cross_entropy_with_logits_sg = SoftmaxCrossEntropyWithLogits[T=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:GPU:0"](softmax_cross_entropy_with_logits_sg/Reshape, softmax_cross_entropy_with_logits_sg/Reshape_1)]]

Caused by op u'softmax_cross_entropy_with_logits_sg', defined at:
  File "scr.py", line 346, in <module>
    labels=y_true)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/util/deprecation.py", line 250, in new_func
    return func(*args, **kwargs)
  [...]
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/ops.py", line 1650, in __init__
    self._traceback = self._graph._extract_stack()  # pylint: disable=protected-access

InvalidArgumentError (see above for traceback): logits and labels must be same size: logits_size=[80,2] labels_size=[1,80]
     [[Node: softmax_cross_entropy_with_logits_sg = SoftmaxCrossEntropyWithLogits[T=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:GPU:0"](softmax_cross_entropy_with_logits_sg/Reshape, softmax_cross_entropy_with_logits_sg/Reshape_1)]]

So the error must come from this part of the code:

x = tf.placeholder(tf.float32, shape=[None, IMG_HEIGHT, IMG_WIDTH, CHANNELS], name='x')

y_true = tf.placeholder(tf.int64, name='y_true')

layer_conv1, weights_conv1 = new_conv_layer(input=x,
                                            num_input_channels=CHANNELS,
                                            filter_size=filter_size1,
                                            num_filters=num_filters1,
                                            use_pooling=True)

layer_conv2, weights_conv2 = new_conv_layer(input=layer_conv1,
                                            num_input_channels=num_filters1,
                                            filter_size=filter_size2,
                                            num_filters=num_filters2,
                                            use_pooling=True)

layer_flat, num_features = flatten_layer(layer_conv2)

layer_fc1 = new_fc_layer(input=layer_flat,
                         num_inputs=num_features,
                         num_outputs=fc_size,
                         use_relu=True)

layer_fc2 = new_fc_layer(input=layer_fc1,
                         num_inputs=fc_size,
                         num_outputs=N_CLASSES,
                         use_relu=False)
# Predicted Class
y_pred = tf.nn.softmax(layer_fc2)
y_pred_cls = tf.argmax(y_pred, axis=1)

This is where I call softmax_cross_entropy_with_logits:
# Cost Function
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=layer_fc2,
                                                        labels=y_true)

cost = tf.reduce_mean(cross_entropy)

Full source here, if needed.

The logits and labels sizes do seem to differ, so what am I doing wrong?

1 Answer:

Answer 0 (score: 1)

tf.nn.softmax_cross_entropy_with_logits expects the labels to be probability distributions (typically one-hot encoded labels).

If you want to use integer labels instead, use tf.losses.sparse_softmax_cross_entropy or tf.nn.sparse_softmax_cross_entropy_with_logits.
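To make the shape requirement concrete, here is a NumPy sketch (not the actual TensorFlow call) of what the sparse loss computes. The shapes mirror the error message: logits of shape [80, 2] and integer labels fed as [1, 80], which must be flattened to [80] before the per-example loss can be gathered. The random data and shapes are illustrative assumptions, not taken from the question's dataset.

```python
import numpy as np

# Hypothetical shapes from the error: logits [80, 2], labels fed as [1, 80].
rng = np.random.default_rng(0)
logits = rng.standard_normal((80, 2)).astype(np.float32)
labels = rng.integers(0, 2, size=(1, 80))  # shape [1, 80] -- the mismatch

# The sparse loss expects integer class ids of shape [batch] = [80],
# so flatten the labels first (in TF: tf.reshape(y_true, [-1])).
labels = labels.reshape(-1)  # now shape [80]

# Manual sparse softmax cross-entropy, mirroring what
# tf.nn.sparse_softmax_cross_entropy_with_logits computes per example.
shifted = logits - logits.max(axis=1, keepdims=True)  # numerical stability
log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
cross_entropy = -log_probs[np.arange(len(labels)), labels]  # shape [80]
cost = cross_entropy.mean()
```

With one-hot labels of shape [80, 2] the non-sparse softmax_cross_entropy_with_logits would work as-is; the choice is between converting the integer labels to one-hot or switching to the sparse variant.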