Is unstack in TensorFlow the same as a transpose?

Time: 2019-02-19 17:55:03

Tags: python tensorflow

In several examples of LSTM code implemented in TensorFlow, I have seen unstack used, so I used it too. However, as far as I can tell, all it really does is transpose the tensor. For example:

import numpy as np
import tensorflow as tf   # TensorFlow 1.x graph-mode API

timesteps = 5
num_input = 3
n_batches = 2

# Batch-major input: (batch, time, features).
X = tf.placeholder("float", [None, timesteps, num_input])
# Split along axis 1 (time) into a list of `timesteps` tensors of shape (batch, features).
Y = tf.unstack(X, timesteps, 1)

x_val = np.random.normal(size=(n_batches, timesteps, num_input))

s = tf.Session()
init = tf.global_variables_initializer()
s.run(init)

res = s.run(X, feed_dict={X: x_val})
for r in res:
    print()
    print(r)

print('-' * 33)

res = s.run(Y, feed_dict={X: x_val})
for r in res:
    print()
    print(r)

The code above prints:

[[ 0.14730155  1.2513759  -2.059696  ]
 [-1.2618986   0.11962503 -1.0680246 ]
 [ 0.9041784  -0.85666233 -1.8460879 ]
 [ 0.7830512  -0.16989689 -2.1662312 ]
 [-0.6366376  -0.54012764  0.09352247]]

[[-1.3709803  -0.9703988  -0.2918467 ]
 [-0.824392   -0.35940772  0.43680435]
 [ 1.2201993   0.6660917   0.03785486]
 [ 0.02935112  1.2725229   0.33364472]
 [-0.5590168   1.139848   -1.3916836 ]]
---------------------------------

[[ 0.14730155  1.2513759  -2.059696  ]
 [-1.3709803  -0.9703988  -0.2918467 ]]

[[-1.2618986   0.11962503 -1.0680246 ]
 [-0.824392   -0.35940772  0.43680435]]

[[ 0.9041784  -0.85666233 -1.8460879 ]
 [ 1.2201993   0.6660917   0.03785486]]

[[ 0.7830512  -0.16989689 -2.1662312 ]
 [ 0.02935112  1.2725229   0.33364472]]

[[-0.6366376  -0.54012764  0.09352247]
 [-0.5590168   1.139848   -1.3916836 ]]
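
For comparison, here is a minimal check (not part of the original code; it reuses the session s, the graph and x_val defined above) that the slices produced by tf.unstack along axis 1 are exactly the rows of the same array transposed to time-major order, i.e. x_val[:, t, :]:

unstacked = s.run(Y, feed_dict={X: x_val})    # list of `timesteps` arrays, each (n_batches, num_input)
time_major = np.transpose(x_val, (1, 0, 2))   # shape (timesteps, n_batches, num_input)

for t, r in enumerate(unstacked):
    assert np.allclose(r, time_major[t])      # each unstacked slice equals x_val[:, t, :]
print('unstack along axis 1 matches a transpose to time-major order')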

So my question is: do we really need unstack? Couldn't we just define the input tensor from the start so that axis 0 corresponds to time and axis 1 to the mini-batch?
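
To make that alternative concrete, here is a minimal sketch (again not part of the original code, and only an assumption about what the time-major setup would look like) with a placeholder whose axis 0 is time and axis 1 is the batch, so the data is fed already transposed and any remaining unstack happens along axis 0:

# Hypothetical time-major variant: axis 0 = time, axis 1 = batch.
X_tm = tf.placeholder("float", [timesteps, None, num_input])

# Feed the same data, transposed to (timesteps, n_batches, num_input).
res_tm = s.run(X_tm, feed_dict={X_tm: np.transpose(x_val, (1, 0, 2))})

# APIs such as tf.nn.static_rnn still expect a list of per-timestep tensors,
# so in that case an unstack along axis 0 would still be needed:
Y_tm = tf.unstack(X_tm, timesteps, 0)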

0 Answers:

No answers