Custom loss_weights for a Keras LSTM with multiple outputs

Asked: 2019-02-19 14:59:23

Tags: keras lstm loss-function

I am using a Keras LSTM to predict two different outputs from the same input data. The model below works fine. My question is whether there is a way to modify the loss_weights argument in the model.compile line.

With the current options, the final loss is defined as 0.5 * loss_output1 + 0.5 * loss_output2. Instead, I would like the final loss to be computed as sqrt(loss_output1^2 + loss_output2^2).

import keras.backend as K
from keras.models import Model
from keras.layers import Input, LSTM, Dropout, Dense
from keras.optimizers import Adam

# Custom loss: RMSE normalized by the mean of the true values
def rmse_loss(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true), axis=-1)) / K.mean(y_true)

inputs = Input((data_tensor["X"].shape[1], data_tensor["X"].shape[2]), name='input')

# Shared LSTM stack
x = LSTM(units=nHid, return_sequences=True)(inputs)
x = Dropout(dropout)(x)
x = LSTM(units=nHid, activation='linear')(x)
x = Dropout(dropout)(x)

# Two output heads fed by the same hidden representation
output1 = Dense(units=1, activation="relu")(x)
output2 = Dense(units=1, activation="relu")(x)

model = Model(inputs=[inputs], outputs=[output1, output2])

# Compile model: total loss = 0.5 * loss_output1 + 0.5 * loss_output2
model.compile(loss=[rmse_loss, rmse_loss], optimizer=Adam(), loss_weights=[0.5, 0.5])
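A minimal sketch of one way to get sqrt(loss_output1^2 + loss_output2^2) as the total loss, since loss_weights only supports a weighted sum: concatenate the two outputs into a single two-column head and combine the per-output losses inside one custom loss function. The concatenate layer, the column slicing, and the stacked targets below are assumptions for illustration, not part of the model above.

from keras.layers import concatenate

# Merge both outputs into one 2-column tensor so a single loss sees both
merged = concatenate([output1, output2])
combined_model = Model(inputs=[inputs], outputs=[merged])

def combined_rmse_loss(y_true, y_pred):
    # Column 0 corresponds to output1, column 1 to output2
    loss1 = rmse_loss(y_true[:, 0:1], y_pred[:, 0:1])
    loss2 = rmse_loss(y_true[:, 1:2], y_pred[:, 1:2])
    # Euclidean combination of the two losses instead of a weighted sum
    return K.sqrt(K.square(loss1) + K.square(loss2))

combined_model.compile(loss=combined_rmse_loss, optimizer=Adam())

The training targets would then have to be stacked the same way, e.g. as a single array with the two target columns side by side.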

0 Answers:

There are no answers yet.