Is this even possible? The counterpart of a deconvolution layer in TensorFlow is tf.nn.conv2d_transpose(), but the documentation points out that it is only a transposed convolution layer, not a true deconvolution layer. So how should the weights of the deconvolution layer be computed? In the code below, how can I make y == x (by computing W2 from W)? Is that possible, or is training the deconvolution layer the only way?
import numpy as np
import tensorflow as tf

# Input: [batch, height, width, depth]
x_image = tf.placeholder(tf.float32, shape=[3, 2])
x = tf.reshape(x_image, [1, 3, 2, 1])

# Filter for conv2d: W [kernel_height, kernel_width, input_depth, output_depth]
W_cpu = np.array([[-1, 1]], dtype=np.float32)
W = tf.Variable(W_cpu)
W = tf.reshape(W, [1, 2, 1, 1])

# Filter for conv2d_transpose: W2 [kernel_height, kernel_width, output_depth, input_depth]
W_cpu2 = np.array([[-1, 1]], dtype=np.float32)
W2 = tf.Variable(W_cpu2)
W2 = tf.reshape(W2, [1, 2, 1, 1])

strides = [1, 1, 1, 1]
padding = 'VALID'

# Forward convolution, then its "transposed" counterpart
z = tf.nn.conv2d(x, W, strides=strides, padding=padding)
y = tf.nn.conv2d_transpose(z, W2, [1, 3, 2, 1], strides, padding)

x_data = np.array([[1, -1], [2, 2], [1, 2]], dtype=np.float32)

with tf.Session() as sess:
    init = tf.global_variables_initializer()
    sess.run(init)

    x = sess.run(x, feed_dict={x_image: x_data})
    W = sess.run(W, feed_dict={x_image: x_data})
    z = sess.run(z, feed_dict={x_image: x_data})
    y = sess.run(y, feed_dict={x_image: x_data})

    print("The shape of x:\t", x.shape, ",\t and x.reshape(3,2) is:")
    print(x.reshape(3, 2))
    print("")
    print("The shape of W:\t", W.shape, ",\t and W.reshape(1,2) is:")
    print(W.reshape(1, 2))
    print("")
    print("The shape of z:\t", z.shape, ",\t and z.reshape(3) is:")
    print(z.reshape(3))
    print("")
    print("The shape of y:\t", y.shape, ",\t and y.reshape(3,2) is:")
    print(y.reshape(3, 2))
    print("")
However, in "Visualizing and Understanding Convolutional Networks", Matthew D. Zeiler and Rob Fergus propose a way to invert a convolution using "transposed versions of the same filters". Does that mean the filter's shape is transposed, or its weights?
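To make that last question concrete, here is a minimal sketch of the two readings I can think of (the kernel below is made up, purely for illustration):

    import numpy as np

    # A made-up 2x3 kernel, only to illustrate the two possible readings
    # of "transposed versions of the same filters".
    W_example = np.array([[1, 2, 3],
                          [4, 5, 6]], dtype=np.float32)

    # Reading 1: the *shape* is transposed (height and width axes swapped).
    W_shape_transposed = W_example.T          # shape (3, 2)

    # Reading 2: the *weights* are flipped vertically and horizontally
    # (a 180-degree rotation of the kernel), keeping the same shape.
    W_flipped = W_example[::-1, ::-1]         # shape (2, 3), values reversed

    print(W_shape_transposed)
    print(W_flipped)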