Suppose I have a tensor:
tensor = tf.constant(
[[[0.05340263, 0.27248233, 0.49127685, 0.07926575, 0.96054204],
[0.50013988, 0.05903472, 0.43025479, 0.41379231, 0.86508251],
[0.02033722, 0.11996034, 0.57675261, 0.12049974, 0.65760677],
[0.71859089, 0.22825203, 0.64064407, 0.47443116, 0.64108334]],
[[0.18813498, 0.29462021, 0.09433628, 0.97393446, 0.33451445],
[0.01657461, 0.28126666, 0.64016929, 0.48365073, 0.26672697],
[0.9379696 , 0.44648103, 0.39463243, 0.51797975, 0.4173626 ],
[0.89788558, 0.31063058, 0.05492096, 0.86904097, 0.21696292]],
[[0.07279436, 0.94773635, 0.34173115, 0.7228713 , 0.46553334],
[0.61199848, 0.88508141, 0.97019517, 0.61465985, 0.48971128],
[0.53037002, 0.70782324, 0.32158754, 0.2793538 , 0.62661128],
[0.52787814, 0.17085317, 0.83711126, 0.40567032, 0.71386498]]])
with shape (3, 4, 5).
I want to slice it to get a new tensor of shape (3, 5), given a 1D tensor whose values indicate which position to take from each row, e.g.:
index_tensor = tf.constant([2,1,3])
This would produce a new tensor like:
[[0.02033722, 0.11996034, 0.57675261, 0.12049974, 0.65760677],
[0.01657461, 0.28126666, 0.64016929, 0.48365073, 0.26672697],
[0.52787814, 0.17085317, 0.83711126, 0.40567032, 0.71386498]]
i.e., along the second dimension, take the items at indices 2, 1, and 3 respectively. This is similar to:
tensor[:,x,:]
except that that only gives me the items at a single index x along the dimension; I want the index to vary per row.
Can this be done?
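To spell out the semantics: row i of the result should be tensor[i, index_tensor[i], :]. A loop like the following sketch gives what I mean (assuming the tensor defined above; the names indices, rows, and result are just for illustration), but I'm hoping for a cleaner, vectorized slicing operation:

import tensorflow as tf

# Illustration only: pick one inner row per outer element and stack them.
indices = [2, 1, 3]                                          # same values as index_tensor
rows = [tensor[i, idx, :] for i, idx in enumerate(indices)]  # each slice has shape (5,)
result = tf.stack(rows)                                      # shape (3, 5)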
Answer 0 (score: 1)
You can use tf.one_hot() to turn index_tensor into a mask:
index = tf.one_hot(index_tensor,tensor.shape[1])
[[0. 0. 1. 0.]
[0. 1. 0. 0.]
[0. 0. 0. 1.]]
Then get the result with tf.boolean_mask():
result = tf.boolean_mask(tensor,index)
[[0.02033722 0.11996034 0.57675261 0.12049974 0.65760677]
[0.01657461 0.28126666 0.64016929 0.48365073 0.26672697]
[0.52787814 0.17085317 0.83711126 0.40567032 0.71386498]]
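Putting the two steps together, a minimal self-contained sketch of this approach (the stand-in tensor and the tf.cast to bool are my additions; tf.boolean_mask documents a boolean mask, though recent TensorFlow versions also accept the float one-hot directly):

import tensorflow as tf

tensor = tf.reshape(tf.range(60, dtype=tf.float32), (3, 4, 5))  # stand-in for the (3, 4, 5) constant in the question
index_tensor = tf.constant([2, 1, 3])

# (3, 4) mask with exactly one True per row, at the position given by index_tensor.
mask = tf.cast(tf.one_hot(index_tensor, tensor.shape[1]), tf.bool)

# boolean_mask keeps only the selected inner rows, collapsing the first two dims -> shape (3, 5).
result = tf.boolean_mask(tensor, mask)

In TF 1.x graph mode you would evaluate this with sess.run(result); under eager execution (TF 2.x) result holds the values directly.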
Answer 1 (score: 0)
tensor = tf.constant(
[[[0.05340263, 0.27248233, 0.49127685, 0.07926575, 0.96054204],
[0.50013988, 0.05903472, 0.43025479, 0.41379231, 0.86508251],
[0.02033722, 0.11996034, 0.57675261, 0.12049974, 0.65760677],
[0.71859089, 0.22825203, 0.64064407, 0.47443116, 0.64108334]],
[[0.18813498, 0.29462021, 0.09433628, 0.97393446, 0.33451445],
[0.01657461, 0.28126666, 0.64016929, 0.48365073, 0.26672697],
[0.9379696 , 0.44648103, 0.39463243, 0.51797975, 0.4173626 ],
[0.89788558, 0.31063058, 0.05492096, 0.86904097, 0.21696292]],
[[0.07279436, 0.94773635, 0.34173115, 0.7228713 , 0.46553334],
[0.61199848, 0.88508141, 0.97019517, 0.61465985, 0.48971128],
[0.53037002, 0.70782324, 0.32158754, 0.2793538 , 0.62661128],
[0.52787814, 0.17085317, 0.83711126, 0.40567032, 0.71386498]]])
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # not strictly needed here (there are no variables), kept from the original
    # Take inner row 2 of outer row 0, row 1 of outer row 1, and row 3 of outer row 2, then concat along axis 1.
    print(sess.run(tf.concat([tensor[0:1, 2:3], tensor[1:2, 1:2], tensor[2:3, 3:4]], 1)))
This prints the following values:
[[[0.02033722 0.11996034 0.5767526 0.12049974 0.6576068 ]
[0.01657461 0.28126666 0.64016926 0.48365074 0.26672697]
[0.52787817 0.17085317 0.83711123 0.40567032 0.713865 ]]]
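The three slices above are hard-coded for the example indices 2, 1, 3. A sketch of the same concat-based idea driven by the index list instead (assuming import tensorflow as tf, the tensor defined above, and the indices as a plain Python list; the trailing squeeze is my addition to obtain the (3, 5) shape asked for in the question):

indices = [2, 1, 3]
# One (1, 1, 5) slice per outer row, concatenated along axis 1 -> shape (1, 3, 5).
slices = [tensor[i:i + 1, idx:idx + 1] for i, idx in enumerate(indices)]
stacked = tf.concat(slices, 1)
# Drop the leading singleton dimension to get shape (3, 5).
result = tf.squeeze(stacked, axis=0)

with tf.Session() as sess:
    print(sess.run(result))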