TensorFlow gradient is always zero

Date: 2016-05-17 00:06:02

Tags: tensorflow convolution gradient-descent

I have written a small TensorFlow program that convolves an image patch with the same convolution kernel num_unrollings times in succession, then tries to minimize the mean squared difference between the resulting values and a target output.

However, when I run the model with num_unrollings greater than 1, the gradient of my loss (tf_loss) with respect to the convolution kernel (tf_kernel) is zero, so no learning takes place.

Here is the smallest code (Python 3) I could come up with that reproduces the problem; sorry for the length:

import tensorflow as tf
import numpy as np

batch_size = 1
kernel_size = 3
num_unrollings = 2

input_image_size = (kernel_size//2 * num_unrollings)*2 + 1

graph = tf.Graph()

with graph.as_default():
    # Input data
    tf_input_images = tf.random_normal(
        [batch_size, input_image_size, input_image_size, 1]
    )

    tf_outputs = tf.random_normal(
        [batch_size]
    )

    # Convolution kernel
    tf_kernel = tf.Variable(
        tf.zeros([kernel_size, kernel_size, 1, 1])
    )

    # Perform convolution(s)
    _convolved_input = tf_input_images
    for _ in range(num_unrollings):
        _convolved_input = tf.nn.conv2d(
            _convolved_input, 
            tf_kernel, 
            [1, 1, 1, 1], 
            padding="VALID"
        )

    tf_prediction = tf.reshape(_convolved_input, shape=[batch_size])

    tf_loss = tf.reduce_mean(
        tf.squared_difference(
            tf_prediction,
            tf_outputs
        )
    )

    # FIXME: why is this gradient zero when num_unrollings > 1??
    tf_gradient = tf.concat(0, tf.gradients(tf_loss, tf_kernel))

# Calculate and report gradient
with tf.Session(graph=graph) as session:

    tf.initialize_all_variables().run()

    gradient = session.run(tf_gradient)

    print(gradient.reshape(kernel_size**2))
    # prints [ 0.  0.  0.  0.  0.  0.  0.  0.  0.]

Thanks for your help!

1 answer:

Answer 0 (score: 1)

Try replacing

# Convolution kernel
tf_kernel = tf.Variable(
    tf.zeros([kernel_size, kernel_size, 1, 1])
)

with something like:

# Convolution kernel
tf_kernel = tf.Variable(
    tf.random_normal([kernel_size, kernel_size, 1, 1])
)
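
The reason the zero initialization gets stuck: with num_unrollings = 2 the prediction is the input convolved twice with the same kernel, so it is a homogeneous quadratic in the kernel entries. Every term of its derivative therefore still contains a factor of a kernel entry, and the whole gradient vanishes exactly at tf_kernel = 0. A minimal 1-D NumPy sketch (an illustration added here, not part of the original answer; `loss` and `fd_grad` are made-up helper names) shows the same effect with finite differences:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=5)   # 1-D analogue of the input image patch
y = rng.normal()         # scalar target output
eps = 1e-6

def loss(k):
    # Two successive "VALID" convolutions with the same kernel,
    # then the squared error against the target.
    h = np.convolve(x, k, mode="valid")        # length 3
    pred = np.convolve(h, k, mode="valid")[0]  # length 1
    return (pred - y) ** 2

def fd_grad(k):
    # Central finite-difference gradient of loss with respect to k.
    g = np.zeros_like(k)
    for i in range(k.size):
        e = np.zeros_like(k)
        e[i] = eps
        g[i] = (loss(k + e) - loss(k - e)) / (2 * eps)
    return g

print(fd_grad(np.zeros(3)))          # [0. 0. 0.] -- stuck at k = 0
print(fd_grad(rng.normal(size=3)))   # generally non-zero
```

With a random starting point the symmetry is broken and gradient descent can make progress, which is what the tf.random_normal initialization above achieves. (For num_unrollings = 1 the prediction is linear in the kernel, so the gradient at zero is non-zero and zero initialization happens to work.)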