How to reshape a (None, 10) tensor to (None, None, 10) in Keras?

Asked: 2019-04-27 19:05:51

Tags: python machine-learning keras neural-network recurrent-neural-network

I am trying to feed variable-length sequences into an LSTM, so I am using a generator with a batch size of 1.
I have a (sequence_length,) input tensor that gets embedded, producing a (batch_size, sequence_length, embedding_dimension) tensor.
In parallel, I have additional input data of size (sequence_length, features), i.e. (None, 10), which I want to reshape to (batch_size, sequence_length, features), i.e. (None, None, 10).

I tried using a Keras Reshape layer with target_shape (-1, 10), which I expected to unroll (None, 10) into (None, None, 10), but instead I get a (None, 1, 10) tensor, which makes it impossible to concatenate it with the embedded data and feed the result to the LSTM.
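A quick NumPy sketch (outside Keras, purely to illustrate the shape arithmetic) shows why this happens: Reshape's target_shape excludes the batch dimension, so (-1, 10) is applied to each (10,) sample individually, giving (1, 10) per sample and hence (None, 1, 10) for the batch:

```python
import numpy as np

# Each sample the Reshape layer sees has shape (10,); the batch axis is excluded.
sample = np.zeros(10)

# target_shape=(-1, 10) applied per sample collapses to a single "time step":
reshaped = sample.reshape(-1, 10)
print(reshaped.shape)  # (1, 10)

# With the batch axis added back, the layer outputs (batch, 1, 10),
# which Keras reports as (None, 1, 10) -- not (None, None, 10).
batch = np.zeros((4, 10))
print(batch.reshape(batch.shape[0], -1, 10).shape)  # (4, 1, 10)
```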
My code:

cluster = Input(shape=(None,))
embedded = Embedding(115, 25, input_length = None)(cluster)

features = Input(shape=(10,)) #variable
reshape = Reshape(target_shape=(-1, 10))(features)

merged = Concatenate(axis=-1)([embedded, reshape])

[...]

model.fit_generator(generator(), steps_per_epoch=1, epochs=5)

Output:

[...]
ValueError: A `Concatenate` layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, None, 25), (None, 1, 10)]

How can I reshape a (None, 10) tensor into a (None, None, 10) tensor in Keras?

1 Answer:

Answer 0 (score: 1)

There is no benefit to doing this reshape in Keras rather than in NumPy. You can declare:

# perform reshaping prior to passing to Keras
features = Input(shape=(None, 10))

and perform the reshape before passing the data to Keras, i.e. inside the generator, where the actual batch_size and sequence_length are available.
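A minimal sketch of such a generator (the helper names and toy data are illustrative, assuming batch_size = 1 and NumPy arrays): it adds the batch axis with np.expand_dims before yielding, so the features arrive already shaped (1, sequence_length, 10) to match Input(shape=(None, 10)):

```python
import numpy as np

def generator(cluster_seqs, feature_seqs, labels):
    """Yield one variable-length sequence per step (batch_size = 1).

    cluster_seqs: list of int arrays, each of shape (sequence_length,)
    feature_seqs: list of float arrays, each of shape (sequence_length, 10)
    labels:       list of per-sequence targets
    """
    while True:
        for c, f, y in zip(cluster_seqs, feature_seqs, labels):
            # Add the batch axis: (seq_len,) -> (1, seq_len)
            c_batch = np.expand_dims(c, axis=0)
            # (seq_len, 10) -> (1, seq_len, 10), matching Input(shape=(None, 10))
            f_batch = np.expand_dims(f, axis=0)
            yield [c_batch, f_batch], np.asarray([y])

# Example: a single sequence of length 7
gen = generator([np.arange(7)], [np.zeros((7, 10))], [1.0])
(x1, x2), y = next(gen)
print(x1.shape, x2.shape)  # (1, 7) (1, 7, 10)
```

With this, the Reshape layer in the model is no longer needed; both inputs already carry matching batch and time dimensions and can be concatenated on the feature axis.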