This is my Siamese network. The 98 images are encoded as a numpy ndarray, which works fine. The labels are 0, 1, 2, 3 and 4, and they are converted to one-hot with TensorFlow, which also works fine. The mode is currently "train". I am following and adapting the TensorFlow MNIST tutorial code. I have 98 images, each belonging to one of classes 0..4. Everything works up to the logits, whose first dimension does not match my labels, and that is the problem.
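For context, this is the kind of sanity check I run on those arrays before training (a minimal sketch; train_data and train_labels are placeholder names standing in for my real ndarrays, and the shapes in the comments are what I expect, not captured output):

import numpy as np

# Stand-ins for my real arrays, only so the check is self-contained.
train_data = np.zeros((98, 28, 28), dtype=np.float32)           # 98 grayscale 28x28 images
train_labels = np.array([i % 5 for i in range(98)], np.int32)   # one class id (0..4) per image

print(train_data.shape)         # expecting (98, 28, 28)
print(train_labels.shape)       # expecting (98,)
print(np.unique(train_labels))  # expecting [0 1 2 3 4]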
import tensorflow as tf


def siamese_network(features, labels, mode):
    """Model function for Siamese Network"""
    # Reshape the incoming feature batch into 28x28 single-channel images.
    input_layer = tf.reshape(features, [-1, 28, 28, 1])
    conv1 = tf.layers.conv2d(
        inputs=input_layer,
        filters=32,
        kernel_size=[5, 5],
        padding="same",
        activation=tf.nn.relu)
    pool1 = tf.layers.max_pooling2d(inputs=conv1, pool_size=[2, 2], strides=2)
    conv2 = tf.layers.conv2d(
        inputs=pool1,
        filters=64,
        kernel_size=[5, 5],
        padding="same",
        activation=tf.nn.relu)
    pool2 = tf.layers.max_pooling2d(inputs=conv2, pool_size=[2, 2], strides=2)
    pool2_flat = tf.reshape(pool2, [-1, 7 * 7 * 64])
    dense = tf.layers.dense(inputs=pool2_flat, units=1024, activation=tf.nn.relu)
    dropout = tf.layers.dropout(
        inputs=dense, rate=0.4, training=mode == tf.estimator.ModeKeys.TRAIN)
    # One logit per class (5 classes).
    logits = tf.layers.dense(inputs=dropout, units=5)
    print(logits)  # shows shape (294, 5)
    predictions = {
        "classes": tf.argmax(input=logits, axis=1),
        "probabilities": tf.nn.softmax(logits, name="softmax_tensor")
    }
    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions)
    onehot_labels = tf.one_hot(indices=tf.cast(labels, tf.int32), depth=5)
    print(onehot_labels)  # shape is (98, 5)
    print(logits)         # shape is (294, 5)
    # The error is raised on the loss below; everything up to this point is fine.
    loss = tf.losses.softmax_cross_entropy(
        onehot_labels=onehot_labels, logits=logits)
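For completeness, the rest of my model function just follows the MNIST tutorial, so the optimizer, learning rate and eval metric below are the tutorial's, not anything I customised (sketch):

    # Remainder of the model function, taken from the MNIST tutorial.
    if mode == tf.estimator.ModeKeys.TRAIN:
        optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.001)
        train_op = optimizer.minimize(
            loss=loss,
            global_step=tf.train.get_global_step())
        return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op)

    # EVAL mode: report accuracy against the integer labels.
    eval_metric_ops = {
        "accuracy": tf.metrics.accuracy(
            labels=labels, predictions=predictions["classes"])}
    return tf.estimator.EstimatorSpec(
        mode=mode, loss=loss, eval_metric_ops=eval_metric_ops)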
The error I get on the loss line is:
ValueError: Shapes (294, 5) and (98, 5) are incompatible
I tried changing pool2_flat = tf.reshape(pool2, [-1, 7 * 7 * 64]) and the pool_size to [5, 5], and the first dimension of logits changed to (147, 5), (96, 5), and so on, but nothing helped. It cannot really be a good solution anyway, because when the training-set size changes those dimensions will change with it. Where am I going wrong? Thanks.
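For reference, this is roughly how I build the estimator and feed the 98 images (a sketch; train_data and train_labels are the same placeholder names as in the check at the top, standing in for my real arrays, and the batch size is just illustrative):

import numpy as np
import tensorflow as tf

# Stand-ins for my real arrays.
train_data = np.zeros((98, 28, 28), dtype=np.float32)
train_labels = np.array([i % 5 for i in range(98)], dtype=np.int32)

train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x=train_data,
    y=train_labels,
    batch_size=98,    # illustrative: all 98 images per step
    num_epochs=None,
    shuffle=True)

classifier = tf.estimator.Estimator(model_fn=siamese_network)
classifier.train(input_fn=train_input_fn, steps=1000)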