I have the following models:
import keras
from keras.layers import Input, Dense
from keras.models import Model
# Joint input layer for both model A and B
inputs = Input(shape=(12,))
# ---------------------------------------
# model_A
x = Dense(64, activation='relu')(inputs)
x = Dense(64, activation='relu')(x)
predictions_A = Dense(3, activation='softmax')(x)
model_A = Model(inputs=inputs, outputs=predictions_A)
# ---------------------------------------
# model_B
inputs_B = keras.layers.concatenate([inputs, predictions_A])
x1 = Dense(64, activation='relu')(inputs_B)
x1 = Dense(64, activation='relu')(x1)
predictions_B = Dense(1, activation='sigmoid')(x1)
model_B = Model(inputs=inputs, outputs=predictions_B)
The loss functions of the two models are:
model_A.compile(optimizer='rmsprop',
                loss='categorical_crossentropy',
                metrics=['accuracy'])
model_B.compile(loss='mean_squared_error', optimizer='adam')
I am able to train the two models separately, as shown below:
model_A.fit(my_data_x, pd.get_dummies(my_data['target_categorical'], prefix=['cate_']))
model_B.fit(my_data_x, my_data_y)
The code runs, but this is not what I want. I want to train model_A and model_B "simultaneously": that is, model_A should use its own cross-entropy loss while also taking model_B's backpropagated error into account. Is this possible?
Answer 0: (score: 1)
You need a single model with two outputs:
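A minimal sketch of that idea, reusing the question's layer sizes: both heads live in one `Model`, so gradients from both losses flow back into the shared layers during a single `fit` call. The output names `out_A`/`out_B`, the `loss_weights` values, and the synthetic placeholder data (standing in for `my_data_x` and the two targets) are assumptions for illustration, not part of the original code.

```python
import numpy as np
from keras.layers import Input, Dense, concatenate
from keras.models import Model

# synthetic placeholder data (stand-ins for my_data_x and the two targets)
X = np.random.rand(32, 12)
y_A = np.eye(3)[np.random.randint(0, 3, 32)]  # one-hot targets, 3 classes
y_B = np.random.rand(32, 1)

# shared trunk + head A (as in the question)
inputs = Input(shape=(12,))
x = Dense(64, activation='relu')(inputs)
x = Dense(64, activation='relu')(x)
predictions_A = Dense(3, activation='softmax', name='out_A')(x)

# head B consumes the input plus A's predictions (as in the question)
merged = concatenate([inputs, predictions_A])
x1 = Dense(64, activation='relu')(merged)
x1 = Dense(64, activation='relu')(x1)
predictions_B = Dense(1, activation='sigmoid', name='out_B')(x1)

# one model, two outputs: both losses are minimized jointly,
# so B's error also backpropagates through A's layers
model = Model(inputs=inputs, outputs=[predictions_A, predictions_B])
model.compile(optimizer='rmsprop',
              loss={'out_A': 'categorical_crossentropy',
                    'out_B': 'mean_squared_error'},
              loss_weights={'out_A': 1.0, 'out_B': 1.0})

model.fit(X, {'out_A': y_A, 'out_B': y_B}, epochs=1, verbose=0)
```

Note that only one optimizer applies here; `loss_weights` controls the relative importance of the two losses, and the total loss is their weighted sum. The separate `rmsprop`/`adam` split from the question cannot be kept in a single joint model.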