I am training an autoencoder, and after tuning it I want to use the encoder as part of a model for a regression problem by adding a fully connected layer on top of it. After training the autoencoder, when I try to combine the encoder with the fully connected layer, I get the following error:
ValueError: Dimension 1 in both shapes must be equal, but are 128 and 64. Shapes are [32,128] and [32,64]. for 'Assign_36' (op: 'Assign') with input shapes: [32,128], [32,64].
I understand there is a dimension mismatch somewhere, but I cannot identify where. Sorry, I am new to this. Here is the relevant code:
from keras.layers import Input, Dense, BatchNormalization
from keras.models import Model

# encoder: 788 -> 128 -> 64 -> 32 bottleneck
def encoder(input_data):
    encoder1 = Dense(128, activation='relu')(input_data)
    encoder1 = BatchNormalization()(encoder1)
    encoder1 = Dense(128, activation='relu')(encoder1)
    encoder1 = BatchNormalization()(encoder1)
    encoder2 = Dense(64, activation='relu')(encoder1)
    encoder2 = BatchNormalization()(encoder2)
    encoder2 = Dense(64, activation='relu')(encoder2)
    encoder2 = BatchNormalization()(encoder2)
    encoder3 = Dense(32, activation='relu')(encoder2)
    encoder3 = BatchNormalization()(encoder3)
    encoder3 = Dense(32, activation='relu')(encoder3)
    encoder3 = BatchNormalization()(encoder3)
    return encoder3
# decoder: 32 -> 64 -> 128 -> 788 reconstruction
def decoder(encoder, input_data):
    decoder3 = Dense(64, activation='relu')(encoder)
    decoder3 = BatchNormalization()(decoder3)
    decoder3 = Dense(64, activation='relu')(decoder3)
    decoder3 = BatchNormalization()(decoder3)
    decoder2 = Dense(128, activation='relu')(decoder3)
    decoder2 = BatchNormalization()(decoder2)
    decoder2 = Dense(128, activation='relu')(decoder2)
    decoder2 = BatchNormalization()(decoder2)
    decoder1 = Dense(788, activation='sigmoid')(decoder2)
    return decoder1
# fully connected regression head on top of the encoder output
def fc(enco):
    den = Dense(128, activation='relu')(enco)
    out = Dense(1, activation='tanh')(den)
    return out
input_data = Input(shape=(788,))

autoencoder = Model(input_data, decoder(encoder(input_data), input_data))
autoencoder.compile(optimizer='adadelta', loss='mse')
autoencoder.fit(train_x, train_x,
                epochs=5,
                batch_size=32,
                shuffle=True,
                validation_data=(test_x, test_x))

# building the combined model
encode = encoder(input_data)
full_model = Model(input_data, fc(encode))
full_model.set_weights(autoencoder.get_weights())  # raises the ValueError above
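For reference, the per-layer weight shapes of the two models can be compared with a short snippet like the one below (a rough sketch, using only autoencoder and full_model as defined above); this is meant to show where the two weight lists stop lining up:

# Rough sketch: print the weight shapes of both models side by side
# (uses only autoencoder and full_model as defined above).
for i, (w_ae, w_fc) in enumerate(zip(autoencoder.get_weights(),
                                     full_model.get_weights())):
    print(i, w_ae.shape, w_fc.shape)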