Concatenating arrays on axis=3 when the first dimension differs

Asked: 2018-04-30 13:07:00

Tags: python-3.x numpy concatenation

I have about 20 data arrays of different shapes, and I want to concatenate them on axis=3:

data_1=dim(1000,150,10)
data_2=dim(1000,150,10)
data_3=dim(1000,150,10)
data_4=dim(1000,150,10)

features_1=dim(1000,150,10)
features_2=dim(1000,150,10)
features_3=dim(1000,150,10)
features_4=dim(1000,150,10)

I combine them into data and features variables,

so that

data.shape= (4,1000,150,10)

features.shape=(4,1000,150,10)

What I want to do:

Concatenate data and features on axis=3 into a variable named data_concat,

so that data_concat.shape=(4,1000,150,20)

To do this, I did the following:

data_concat = np.concatenate((data,features),axis=3)

However, this does not work when the first dimensions are not the same. For example:

data_1=dim(1000,150,10)
data_2=dim(1200,150,10)
data_3=dim(800,150,10)
data_4=dim(400,150,10)

features_1=dim(1000,150,10)
features_2=dim(1200,150,10)
features_3=dim(800,150,10)
features_4=dim(400,150,10)

As a result,

data.shape= (4,)

features.shape=(4,)

Doing:

data_concat = np.concatenate((data,features),axis=3)

does not work, because concatenate cannot see axis=3, since

data.shape =(4,)

features.shape=(4,)
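
For context, here is a minimal sketch of how this situation arises (the array sizes here are smaller placeholders; recent numpy versions require dtype=object explicitly for such ragged collections, while older versions built the object array implicitly):

import numpy as np

# four batches whose first (batch) dimension differs -- placeholder sizes
data = np.array([np.zeros((n, 15, 10)) for n in (10, 12, 8, 4)], dtype=object)
features = np.array([np.zeros((n, 15, 10)) for n in (10, 12, 8, 4)], dtype=object)

print(data.shape)  # (4,) -- a 1-D object array, not a 4-D array
# np.concatenate((data, features), axis=3)  # AxisError: axis 3 is out of bounds for array of dimension 1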

Thanks

2 answers:

Answer 0 (score: 1)

Because of all the mathematical assumptions they rely on, numpy arrays must have a well-defined shape. When that is not possible, numpy builds an array of arrays (an object array), as in your second example: as you noticed, you cannot use axis=3 with np.concatenate, because the outer array is treated as one-dimensional.

Perhaps you can get closer to what you intend by concatenating each data variable separately with its corresponding features variable, like so:

# concatenate each data array with its matching features array along the last axis
df_1 = np.concatenate((data_1, features_1), axis=2)
df_2 = np.concatenate((data_2, features_2), axis=2)
df_3 = np.concatenate((data_3, features_3), axis=2)
df_4 = np.concatenate((data_4, features_4), axis=2)

# keep the per-batch results in a plain list rather than an object array
data = [df_1, df_2, df_3, df_4]

However, from your data I notice that the second and third dimensions are always the same. To me it looks like you are trying to combine several batches of different lengths that all hold the same kind of data. If that is the case, why not concatenate data_1, data_2, etc. along axis 0? numpy has no problem with that.
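
If that fits your case, a rough sketch could look like the following (the lengths and the data_parts / feature_parts names are just placeholders standing in for your data_1 ... data_4 and features_1 ... features_4):

import numpy as np

# placeholder batches; only the first (batch) dimension differs
lengths = (1000, 1200, 800, 400)
data_parts = [np.zeros((n, 150, 10)) for n in lengths]
feature_parts = [np.zeros((n, 150, 10)) for n in lengths]

# join all batches along axis 0, then data and features along the last axis
data = np.concatenate(data_parts, axis=0)                # (3400, 150, 10)
features = np.concatenate(feature_parts, axis=0)         # (3400, 150, 10)
data_concat = np.concatenate((data, features), axis=2)   # (3400, 150, 20)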

Answer 1 (score: 1)

This can be done with a list comprehension, or with np.frompyfunc if the result should again be an array:

# create example
>>> data = np.array([np.arange(n*12).reshape(n, 2, 6) for n in range(2, 5)])
>>> features = np.array([np.ones((n, 2, 6), int) for n in range(2, 5)])
>>> data.shape, features.shape
((3,), (3,))
>>> 
# list comprehension
>>> [np.concatenate(xy, 2) for xy in zip(data, features)]
[array([[[ 0,  1,  2,  3,  4,  5,  1,  1,  1,  1,  1,  1],
        [ 6,  7,  8,  9, 10, 11,  1,  1,  1,  1,  1,  1]],

       [[12, 13, 14, 15, 16, 17,  1,  1,  1,  1,  1,  1],
        [18, 19, 20, 21, 22, 23,  1,  1,  1,  1,  1,  1]]]), array([[[ 0,  1,  2,  3,  4,  5,  1,  1,  1,  1,  1,  1],
        [ 6,  7,  8,  9, 10, 11,  1,  1,  1,  1,  1,  1]],

       [[12, 13, 14, 15, 16, 17,  1,  1,  1,  1,  1,  1],
        [18, 19, 20, 21, 22, 23,  1,  1,  1,  1,  1,  1]],

       [[24, 25, 26, 27, 28, 29,  1,  1,  1,  1,  1,  1],
        [30, 31, 32, 33, 34, 35,  1,  1,  1,  1,  1,  1]]]), array([[[ 0,  1,  2,  3,  4,  5,  1,  1,  1,  1,  1,  1],
        [ 6,  7,  8,  9, 10, 11,  1,  1,  1,  1,  1,  1]],

       [[12, 13, 14, 15, 16, 17,  1,  1,  1,  1,  1,  1],
        [18, 19, 20, 21, 22, 23,  1,  1,  1,  1,  1,  1]],

       [[24, 25, 26, 27, 28, 29,  1,  1,  1,  1,  1,  1],
        [30, 31, 32, 33, 34, 35,  1,  1,  1,  1,  1,  1]],

       [[36, 37, 38, 39, 40, 41,  1,  1,  1,  1,  1,  1],
        [42, 43, 44, 45, 46, 47,  1,  1,  1,  1,  1,  1]]])]

# frompyfunc
>>> np.frompyfunc(lambda *xy: np.concatenate(xy, 2), 2, 1)(data, features)
array([array([[[ 0,  1,  2,  3,  4,  5,  1,  1,  1,  1,  1,  1],
        [ 6,  7,  8,  9, 10, 11,  1,  1,  1,  1,  1,  1]],

       [[12, 13, 14, 15, 16, 17,  1,  1,  1,  1,  1,  1],
        [18, 19, 20, 21, 22, 23,  1,  1,  1,  1,  1,  1]]]),
       array([[[ 0,  1,  2,  3,  4,  5,  1,  1,  1,  1,  1,  1],
        [ 6,  7,  8,  9, 10, 11,  1,  1,  1,  1,  1,  1]],

       [[12, 13, 14, 15, 16, 17,  1,  1,  1,  1,  1,  1],
        [18, 19, 20, 21, 22, 23,  1,  1,  1,  1,  1,  1]],

       [[24, 25, 26, 27, 28, 29,  1,  1,  1,  1,  1,  1],
        [30, 31, 32, 33, 34, 35,  1,  1,  1,  1,  1,  1]]]),
       array([[[ 0,  1,  2,  3,  4,  5,  1,  1,  1,  1,  1,  1],
        [ 6,  7,  8,  9, 10, 11,  1,  1,  1,  1,  1,  1]],

       [[12, 13, 14, 15, 16, 17,  1,  1,  1,  1,  1,  1],
        [18, 19, 20, 21, 22, 23,  1,  1,  1,  1,  1,  1]],

       [[24, 25, 26, 27, 28, 29,  1,  1,  1,  1,  1,  1],
        [30, 31, 32, 33, 34, 35,  1,  1,  1,  1,  1,  1]],

       [[36, 37, 38, 39, 40, 41,  1,  1,  1,  1,  1,  1],
        [42, 43, 44, 45, 46, 47,  1,  1,  1,  1,  1,  1]]])], dtype=object)
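
For reference, the per-element shapes confirm that each data/features pair was joined along its last axis (out here is just a placeholder name for the frompyfunc result above):

>>> out = np.frompyfunc(lambda *xy: np.concatenate(xy, 2), 2, 1)(data, features)
>>> [a.shape for a in out]
[(2, 2, 12), (3, 2, 12), (4, 2, 12)]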