ValueError: Shapes (None, 1) and (None, 64, 32) are incompatible

Asked: 2020-10-05 19:28:27

Tags: python tensorflow keras lstm

I am new to deep learning, so I am working through this tutorial: https://www.datacamp.com/community/tutorials/using-tensorflow-to-compose-music#key-transposition

I managed to complete the autoencoder and the VAE, but on the last part of the tutorial, the LSTM, I ran into the error below. I tried matching the dimensions but could not find a solution. Can someone explain how to debug a dimension mismatch using the model summary?
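For reference, here is a minimal stand-alone sketch (layer sizes are my own placeholders, not the tutorial's) of how the symbolic output shape can be read off a model and compared with a target shape:

```python
import numpy as np
import tensorflow as tf

# A tiny model mirroring the failing pattern: an Embedding feeding an
# LSTM with return_sequences=True, so the Dense softmax keeps the time axis.
inp = tf.keras.layers.Input(shape=(None,))
emb = tf.keras.layers.Embedding(32, 64)(inp)
seq = tf.keras.layers.LSTM(16, return_sequences=True)(emb)
out = tf.keras.layers.Dense(32, activation='softmax')(seq)
model = tf.keras.Model(inp, out)

# The symbolic output shape shows the extra time dimension that a
# (None, 1) target array cannot match.
print(model.output_shape)  # (None, None, 32)

# Concrete check with a batch of 4 sequences of length 64:
x = np.zeros((4, 64))
print(model.predict(x, verbose=0).shape)  # (4, 64, 32)
```

Comparing `model.output_shape` against the shape of each target array before calling `fit` makes the mismatch visible without waiting for the loss function to raise.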

    ValueError: Shapes (None, 1) and (None, 64, 32) are incompatible

Here are the relevant data and dimensions:

# Convert data to NumPy arrays of floats
# (np.float is deprecated in recent NumPy; use the builtin float instead)
trainChords = np.array(trainChords, dtype=float)                    # shape (3231, 32)
trainDurations = np.array(trainDurations, dtype=float)              # shape (3231, 32)
targetChords = np.array(targetChords, dtype=float).reshape(-1, 1)   # shape (3231, 1)
targetDurations = np.array(targetDurations, dtype=float).reshape(-1, 1)  # shape (3231, 1)
# print(trainChords.shape, trainDurations.shape, targetChords.shape, targetDurations.shape)


# Define number of samples, notes and chords, and durations
nSamples = trainChords.shape[0]
nChords = trainChords.shape[1]
nDurations = trainDurations.shape[1]

# Set the input dimension
inputDim = nChords * sequenceLength

# Set the embedding layer dimension
embedDim = 64

# Define input layers
chordInput = tf.keras.layers.Input(shape = (None,))
durationInput = tf.keras.layers.Input(shape = (None,))

# Define embedding layers
chordEmbedding = tf.keras.layers.Embedding(nChords, embedDim, input_length = sequenceLength)(chordInput)
durationEmbedding = tf.keras.layers.Embedding(nDurations, embedDim, input_length = sequenceLength)(durationInput)

# Merge embedding layers using a concatenation layer
mergeLayer = tf.keras.layers.Concatenate(axis=1)([chordEmbedding, durationEmbedding])

# Define LSTM layer
lstmLayer = tf.keras.layers.LSTM(512, return_sequences=True)(mergeLayer)

# Define dense layer
denseLayer = tf.keras.layers.Dense(256)(lstmLayer)

# Define output layers
chordOutput = tf.keras.layers.Dense(nChords, activation = 'softmax')(denseLayer)
durationOutput = tf.keras.layers.Dense(nDurations, activation = 'softmax')(denseLayer)

# Define model
lstm = tf.keras.Model(inputs = [chordInput, durationInput], outputs = [chordOutput, durationOutput])

# Compile the model
lstm.compile(loss='categorical_crossentropy', optimizer='rmsprop')

# Train the model
lstm.fit([trainChords, trainDurations], [targetChords, targetDurations], epochs=100, batch_size=64)

Model summary:

Model: "model_2"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_5 (InputLayer)            [(None, None)]       0                                            
__________________________________________________________________________________________________
input_6 (InputLayer)            [(None, None)]       0                                            
__________________________________________________________________________________________________
embedding_4 (Embedding)         (None, None, 64)     2048        input_5[0][0]                    
__________________________________________________________________________________________________
embedding_5 (Embedding)         (None, None, 64)     2048        input_6[0][0]                    
__________________________________________________________________________________________________
concatenate_2 (Concatenate)     (None, None, 64)     0           embedding_4[0][0]                
                                                                 embedding_5[0][0]                
__________________________________________________________________________________________________
lstm_2 (LSTM)                   (None, None, 512)    1181696     concatenate_2[0][0]              
__________________________________________________________________________________________________
dense_6 (Dense)                 (None, None, 256)    131328      lstm_2[0][0]                     
__________________________________________________________________________________________________
dense_7 (Dense)                 (None, None, 32)     8224        dense_6[0][0]                    
__________________________________________________________________________________________________
dense_8 (Dense)                 (None, None, 32)     8224        dense_6[0][0]                    
==================================================================================================
Total params: 1,333,568
Trainable params: 1,333,568
Non-trainable params: 0
__________________________________________________________________________________________________
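For contrast, here is a minimal variant (my own sketch, not the tutorial's code) that does compile and fit with integer class targets of shape (None, 1): `return_sequences=False` drops the time axis so the output is one distribution per sample, and the sparse loss accepts integer labels directly.

```python
import numpy as np
import tensorflow as tf

nChords, embedDim, seqLen = 32, 64, 32

inp = tf.keras.layers.Input(shape=(None,))
x = tf.keras.layers.Embedding(nChords, embedDim)(inp)
# return_sequences=False: output is (batch, units), not (batch, time, units)
x = tf.keras.layers.LSTM(16, return_sequences=False)(x)
out = tf.keras.layers.Dense(nChords, activation='softmax')(x)
model = tf.keras.Model(inp, out)

# Integer class labels of shape (batch, 1) pair with the sparse loss;
# categorical_crossentropy would instead expect one-hot targets.
model.compile(loss='sparse_categorical_crossentropy', optimizer='rmsprop')

X = np.random.randint(0, nChords, size=(8, seqLen))
y = np.random.randint(0, nChords, size=(8, 1))
model.fit(X, y, epochs=1, verbose=0)  # trains without a shape error
```

This is only a shape-compatibility sketch with made-up sizes; whether sequence outputs or per-sample outputs are wanted depends on the tutorial's intent.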

0 Answers:

No answers