TensorFlow: reshape a tensor and pad the end of some rows with zeros

Date: 2018-03-21 14:49:44

Tags: python tensorflow

I am looking for a way to reshape a tensor in TensorFlow. I have a tensor that contains sequences of rows, and I want to reshape it so that all rows belonging to a given sequence end up on a single row of the reshaped tensor.

The difficulty is that the sequences have different lengths. In the example below, I know that a sequence has at most 3 rows: the first sequence spans 2 rows, the second 3 rows, and the third 1 row.

#Data Tensor
[
[1,1,1],
[2,2,2],
[4,4,4],
[5,5,5],
[6,6,6],
[7,7,7]]

#To be reshaped into
[
[1,1,1,2,2,2,0,0,0],
[4,4,4,5,5,5,6,6,6],
[7,7,7,0,0,0,0,0,0]]

#Argument could be of the form: rows to pad
[1 0 2]

#Or its complementary: sequence length
[2 3 1]

Does anyone know how to do this?

One way would be to insert zero rows at the right places in the initial tensor and then apply a simple tf.reshape, but I do not know how to insert the zero rows.

Another way would be to do the reshape directly, but I do not know how to do that either.
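For reference, a minimal sketch of the first idea (insert zero rows, then reshape), assuming the lengths are available as plain Python integers, can use tf.scatter_nd to drop each input row into its padded slot:

import tensorflow as tf

# Row i of sequence s goes to slot s*max_len + i of a zeroed buffer, so
# after scattering every sequence occupies exactly max_len rows and a
# plain reshape produces the desired layout.
data = tf.constant([[x, x, x] for x in [1, 2, 4, 5, 6, 7]])
lengths = [2, 3, 1]  # assumed known at graph-construction time
max_len = max(lengths)

# destination row index for every input row: [0, 1, 3, 4, 5, 6]
idx = [s * max_len + i for s, l in enumerate(lengths) for i in range(l)]
indices = tf.constant([[j] for j in idx])  # shape (6, 1), as tf.scatter_nd expects

scattered = tf.scatter_nd(indices, data, [len(lengths) * max_len, 3])
result = tf.reshape(scattered, [len(lengths), max_len * 3])

with tf.Session() as sess:
    print(sess.run(result))  # matches the desired output above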

1 Answer:

Answer 0: (score: 1)

This should do it, and it is easy to extend (e.g. with different kinds of padding). Let me know if it works as expected!

import tensorflow as tf

def split_and_pad_tensor(tensor, lengths):
    """
    Input: a rank 2 tensor of shape (A, B) and a collection of lengths that
    sum up to A (otherwise tf.split crashes).
    The tensor is split into len(lengths) tensors of the given lengths,
    and each split tensor is zero-padded on the right until all have
    B*max(lengths) elements. Output is then a rank 2 tensor of shape
    (len(lengths), B*max(lengths)).
    """
    num_splits, max_length = len(lengths), max(lengths)
    splitted = tf.split(tensor, lengths, 0)
    # pad's second argument reads [[top, bottom], [left, right]]:
    # append max_length-l zero rows below each split
    padded = tf.stack([tf.pad(s, [[0, max_length - l], [0, 0]])
                       for l, s in zip(lengths, splitted)])
    # flatten the last two axes:
    return tf.reshape(padded, [num_splits, tf.shape(tensor)[1] * max_length])

# make some data and test for different valid inputs:
DATA = tf.constant([[x, x, x] for x in [1, 2, 4, 5, 6, 7]])
with tf.Session() as sess:
    for lengths in ([4, 2], [2, 3, 1], [2, 2, 1, 1]):
        print(sess.run(split_and_pad_tensor(DATA, lengths)))

Output:

[[1 1 1 2 2 2 4 4 4 5 5 5]
 [6 6 6 7 7 7 0 0 0 0 0 0]]
[[1 1 1 2 2 2 0 0 0]
 [4 4 4 5 5 5 6 6 6]
 [7 7 7 0 0 0 0 0 0]]
[[1 1 1 2 2 2]
 [4 4 4 5 5 5]
 [6 6 6 0 0 0]
 [7 7 7 0 0 0]]
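
On the "different kinds of padding" point: assuming a TF version where tf.pad exposes the constant_values argument, filling with a value other than zero is a one-line change to the padding line above:

# pad below with -1 instead of 0 (tf.pad also supports mode='REFLECT'/'SYMMETRIC')
padded = tf.stack([tf.pad(s, [[0, max_length - l], [0, 0]], constant_values=-1)
                   for l, s in zip(lengths, splitted)])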

Pure-TF version with placeholders:

The following code has the same functionality as above, but the inputs are placeholders, and the tf.map_fn + tf.gather combination is used to achieve fully dynamic shapes:

import tensorflow as tf

class SplitAndPadGraph(object):
    def __init__(self):
        # minimal assumptions on the placeholders' shapes
        data_ph = tf.placeholder(tf.float32, shape=[None, None])
        lengths_ph = tf.placeholder(tf.int32, shape=[None])
        # extract information about input shapes
        data_len = tf.shape(data_ph)[0]
        out_dim0 = tf.shape(lengths_ph)[0]
        out_dim1 = tf.reduce_max(lengths_ph)
        out_dim2 = tf.shape(data_ph)[-1]
        # create a [[x,y,z], ...] tensor, where x=start_idx, y=length, z=pad_size
        start_idxs = tf.concat([[0], tf.cumsum(lengths_ph)], 0)[:-1]
        pads = tf.fill([out_dim0], out_dim1)-lengths_ph
        reconstruction_metadata = tf.stack([start_idxs, lengths_ph, pads], axis=1)
        # pass the xyz tensor to map_fn to create a tensor with the proper indexes.
        # then gather the indexes from data_ph and reshape
        reconstruction_data = tf.map_fn(lambda x: tf.concat([tf.range(x[0],x[0]+x[1]),
                                                             tf.fill([x[2]], data_len)],
                                                            0), reconstruction_metadata)
        output = tf.gather(tf.concat([data_ph, tf.zeros((1,out_dim2))], 0),
                           tf.reshape(reconstruction_data, [out_dim0*out_dim1]))
        output = tf.reshape(output, [out_dim0, out_dim1*out_dim2])
        # graph interface to access input and output nodes from outside
        self.data_ph = data_ph
        self.lengths_ph = lengths_ph
        self.output = output

DATA = [[x,x,x] for x in [1,2,4,5,6,7]]
g = SplitAndPadGraph()
with tf.Session() as sess:
    for lengths in [[4,2], [2,3,1], [2,2,1,1]]:
        print "lengths =", lengths
        print sess.run(g.output, feed_dict={g.data_ph:DATA, g.lengths_ph:lengths})
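
To see how the index construction works, here is a hand-worked trace (not program output) for lengths = [2, 3, 1] on the six-row DATA above: start_idxs = [0, 2, 5] and pads = [1, 0, 2], so reconstruction_metadata = [[0, 2, 1], [2, 3, 0], [5, 1, 2]]. tf.map_fn then produces the index rows [0, 1, 6], [2, 3, 4] and [5, 6, 6], where index 6 points at the zero row that tf.concat appended below data_ph, so every padded slot gathers zeros.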

Cheers! Andres