I want to reshape a tensor into a specific shape. After performing the operation, I found that the tensor's values themselves had changed, and I don't understand why.
import tensorflow as tf  # TensorFlow 1.x

tf.reset_default_graph()

with tf.Session() as test:
    tf.set_random_seed(1)
    a_S = tf.random_normal([1, 1, 1, 3], mean=1, stddev=4)
    a_G = tf.random_normal([1, 1, 1, 3], mean=1, stddev=4)
    J_style_layer = compute_layer_style_cost(a_S, a_G)
    print("J_style_layer = " + str(J_style_layer.eval()))
Here is the definition of the function compute_layer_style_cost that is being called:

def compute_layer_style_cost(a_S, a_G):
"""
Arguments:
a_S -- tensor of dimension (1, n_H, n_W, n_C), hidden layer activations
representing style of the image S
a_G -- tensor of dimension (1, n_H, n_W, n_C), hidden layer activations
representing style of the image G
Returns:
J_style_layer -- tensor representing a scalar value, style cost defined
above by equation (2)
"""
    ### START CODE HERE ###
    # Retrieve dimensions from a_G (≈1 line)
    m, n_H, n_W, n_C = a_S.get_shape().as_list()
    print("m=>", m, "n_H=>", n_H, "n_W=>", n_W, "n_C=>", n_C)
    print("a_S.shape=>", a_S.shape)
    print("a_S=>", a_S.eval())

    # Reshape the images to have them of shape (n_C, n_H*n_W) (≈2 lines)
    a_S = tf.reshape(a_S, [n_C, n_H*n_W])
    a_G = tf.reshape(a_G, [n_C, n_H*n_W])
    print("a_S.shape=>", a_S.shape)
    print("a_S=>", a_S.eval())
    # ... (the rest of the function, which computes and returns J_style_layer, is omitted here)
After running this, I got the following output.
m=> 1 n_H=> 1 n_W=> 1 n_C=> 3
a_S.shape=> (1, 1, 1, 3)
a_S=> [[[[-1.68344498 1.89428568 4.18909216]]]]
a_S.shape=> (3, 1)
a_S=> [[-4.78795481]
[ 5.39861012]
[ 4.57472849]]
The output above shows that the values of the tensor changed after the reshape operation, and I don't understand why that happens.
Answer (score: 2)
After reading Operations on random variables not working properly in Tensorflow and https://www.tensorflow.org/programmers_guide/graphs, it seems that I was not evaluating the two tensors in a single session run: each call to eval() executes the graph again, so tf.random_normal is re-sampled and the values printed before and after the reshape come from two different random draws; the reshape itself never changes any values. I changed the code to
with tf.Session() as sess:
    print(sess.run([a_S, a_S_re]))
and it worked.
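To make the cause concrete, here is a minimal, self-contained sketch (assuming TensorFlow 1.x, and assuming that a_S_re in the answer refers to the reshaped tensor): evaluating the random tensor and its reshaped version in two separate run() calls re-executes the tf.random_normal op and yields two different samples, while fetching both in a single sess.run() returns values from one execution, so the reshape is seen to preserve the data.

import numpy as np
import tensorflow as tf  # TensorFlow 1.x, as in the question

tf.reset_default_graph()
tf.set_random_seed(1)

# A small random tensor and its reshaped view (a_S_re is assumed to be the reshaped tensor).
a_S = tf.random_normal([1, 1, 1, 3], mean=1, stddev=4)
a_S_re = tf.reshape(a_S, [3, 1])

with tf.Session() as sess:
    # Two separate evaluations: each run executes the graph again, so
    # tf.random_normal is re-sampled and the two prints show different numbers.
    print("a_S (first run):    ", sess.run(a_S))
    print("a_S_re (second run):", sess.run(a_S_re))

    # One evaluation fetching both tensors: a single graph execution, so
    # a_S_re contains exactly the values of a_S, rearranged into shape (3, 1).
    val_S, val_S_re = sess.run([a_S, a_S_re])
    print("a_S:   ", val_S)
    print("a_S_re:", val_S_re)
    print("same values:", np.allclose(val_S.flatten(), val_S_re.flatten()))

The first two prints differ even though the reshape changes nothing; in the single sess.run the reshaped tensor holds the same three numbers, only laid out as (3, 1).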