TensorFlow CNN accuracy is always zero

Asked: 2017-11-22 17:13:45

标签: tensorflow neural-network deep-learning

I am using a CNN for short-text classification. I know that overfitting can drive all the neurons to zero. But some people say that when a model overfits, the accuracy on every training batch is 1 — that makes no sense to me, because not every true class label is 0. I would expect the accuracy to be low in that case, but not 1.

Here are the relevant parts of my code:

... Define some input placeholders here ...

x_embed_expanded = tf.expand_dims(x_embed, -1)  # add a channel dimension once, outside the loop
pooled_outputs = []
for filter_size in filter_sizes:
    filter_shape = [filter_size, embed_dimen, 1, num_filters]
    W_filter = tf.Variable(tf.truncated_normal(filter_shape, stddev=0.1))
    b_filter = tf.Variable(tf.constant(0.1, shape=[num_filters]))
    conv = tf.nn.conv2d(x_embed_expanded, W_filter, strides=[1, 1, 1, 1], padding="VALID")
    h = tf.nn.relu(tf.nn.bias_add(conv, b_filter), name="relu")
    pooled = tf.nn.max_pool(h, ksize=[1, self.params['max_domain_segments_len'] - filter_size + 1, 1, 1],
                            strides=[1, 1, 1, 1], padding='VALID')
    pooled_outputs.append(pooled)
h_pool = tf.concat(pooled_outputs, axis=3)
num_filters_total = num_filters * len(filter_sizes)
output_vec = tf.reshape(h_pool, [-1, num_filters_total])

hidden = tf.contrib.layers.fully_connected(output_vec, num_outputs=n_rnn_neurons, activation_fn=tf.nn.relu)

logits = tf.contrib.layers.fully_connected(hidden, self.params['num_targets'], activation_fn=tf.nn.relu)

crossentropy = tf.losses.sparse_softmax_cross_entropy(labels=y, logits=logits)

loss_mean = tf.reduce_mean(crossentropy)
optimizer = tf.train.AdamOptimizer(learning_rate=lr_rate)
training_op = optimizer.minimize(loss_mean)

prediction = tf.argmax(logits, axis=-1)
is_correct = tf.nn.in_top_k(logits, y, 1)  # logits are unscaled, but here we only care about the argmax
n_correct = tf.reduce_sum(tf.cast(is_correct, tf.float32))
accuracy = tf.reduce_mean(tf.cast(is_correct, tf.float32))

init = tf.global_variables_initializer()

with tf.Session() as sess:
    init.run()
    ......
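For reference, the shapes flowing through the convolution/pooling branch above can be checked without running TensorFlow: a VALID convolution over a sequence of length L with a filter of height f produces L - f + 1 positions, and the max-pool with ksize equal to that same value collapses the time axis to 1, leaving num_filters features per filter size. A minimal pure-Python sketch of that arithmetic, using made-up hyperparameters (10, [2, 3, 4], 4 are placeholders, not the asker's actual values):

```python
def pooled_feature_count(seq_len, filter_sizes, num_filters):
    """Number of features after concat + reshape in the text-CNN branch."""
    total = 0
    for f in filter_sizes:
        conv_len = seq_len - f + 1            # VALID conv output length over the time axis
        pool_ksize = seq_len - f + 1          # matches the ksize used in the code above
        pool_len = conv_len - pool_ksize + 1  # VALID max-pool -> always 1
        total += pool_len * num_filters
    return total

print(pooled_feature_count(10, [2, 3, 4], 4))  # 12
```

This confirms that `num_filters_total = num_filters * len(filter_sizes)` matches the flattened tensor shape, so the reshape itself is consistent.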
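The accuracy op at the end compares the argmax of the unscaled logits against the labels; since softmax is monotonic, skipping it does not change the argmax. A minimal pure-Python sketch of the same computation (the logits and labels below are made up for illustration):

```python
def batch_accuracy(logits, labels):
    """Fraction of rows whose argmax matches the label (k=1, ignoring ties)."""
    correct = 0
    for row, y in zip(logits, labels):
        pred = max(range(len(row)), key=row.__getitem__)  # argmax over one row
        correct += int(pred == y)
    return correct / len(labels)

logits = [[0.1, 2.0, -1.0],
          [3.0, 0.0,  0.5],
          [0.0, 0.0,  0.0]]  # an all-zero row: plain argmax falls back to index 0
labels = [1, 0, 2]
print(batch_accuracy(logits, labels))  # 2/3
```

One caveat: `tf.nn.in_top_k` is documented to count classes that tie at the top-k boundary as correct, so when rows of logits are exactly equal (e.g. all zeros, as can happen when a final ReLU saturates) it can report a different accuracy than this plain argmax comparison.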

0 Answers:

No answers yet