Adding an intermediate layer to the loss function in Keras (deep learning)

Asked: 2018-06-02 17:05:26

Tags: python tensorflow keras deep-learning autoencoder

I want to use a custom loss function in Keras that depends on the intermediate layers of a DNN model (a VAE). I call the function that builds the model, then add the loss to the model.

The error is:

ValueError: An operation has `None` for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval

The model compiles without complaint; the error is only raised during training.

# Below is the code which causes the error
from keras import backend as K

# this returns the models (en, de, model) and the intermediate
# layers (z_mean and z_log_sigma)
en, de, model, z_mean, z_log_sigma = load_model(config)

# define the KL-divergence loss from the returned intermediate layers
kl_loss = -0.5 * K.mean(1 + z_log_sigma - K.square(z_mean)
                        - K.exp(z_log_sigma), axis=-1)

model.add_loss(kl_loss)
model.compile(optimizer=optimizer)

# the error is raised during training
history = model.fit_generator(
    genfun,
    steps_per_epoch=display_interval,
    epochs=1,
    shuffle=False,
    verbose=1,
)  # callbacks=[eval_map]
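
For reference, the add_loss pattern itself is sound when the KL term is built from tensors that live inside the model's own graph. Below is a minimal, self-contained sketch of that pattern on a toy VAE, assuming Keras 2.x with the TensorFlow backend; the architecture, dimensions, and names (original_dim, latent_dim, the random training data) are illustrative assumptions, not taken from the question.

import numpy as np
from keras import backend as K
from keras.layers import Input, Dense, Lambda
from keras.models import Model

original_dim, latent_dim = 784, 2

# encoder: produces the mean and log-variance of q(z|x)
x = Input(shape=(original_dim,))
h = Dense(64, activation='relu')(x)
z_mean = Dense(latent_dim)(h)
z_log_sigma = Dense(latent_dim)(h)

# reparameterization trick: z = mean + exp(log_sigma / 2) * eps
def sampling(args):
    mean, log_sigma = args
    eps = K.random_normal(shape=K.shape(mean))
    return mean + K.exp(0.5 * log_sigma) * eps

z = Lambda(sampling)([z_mean, z_log_sigma])

# decoder: reconstructs x from z
x_hat = Dense(original_dim, activation='sigmoid')(Dense(64, activation='relu')(z))

vae = Model(x, x_hat)

# reconstruction term plus KL term, both built from tensors in the
# model's own graph, so gradients are defined end to end
xent_loss = K.mean(K.binary_crossentropy(x, x_hat), axis=-1)
kl_loss = -0.5 * K.mean(1 + z_log_sigma - K.square(z_mean)
                        - K.exp(z_log_sigma), axis=-1)
vae.add_loss(K.mean(xent_loss + kl_loss))

vae.compile(optimizer='adam')  # no loss= needed; add_loss supplies it
vae.fit(np.random.rand(128, original_dim), epochs=1, batch_size=32)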

1 Answer:

Answer 0 (score: 1)

This solution works by designing a custom loss function: a wrapper closes over the intermediate tensors (z_mean and z_log_sigma) and returns a Keras-compatible loss(y_true, y_pred).

from keras import backend as K
from keras.losses import binary_crossentropy

en, de, model, z_mean, z_log_sigma = load_model(config)

# the wrapper closes over the intermediate tensors so the inner
# loss(y_true, y_pred) can use them at training time
def custom_loss_wrapper(z_mean=z_mean, z_log_sigma=z_log_sigma):
    def loss(y_true, y_pred):
        xent_loss = binary_crossentropy(y_true, y_pred)
        kl_loss = -0.5 * K.mean(1 + z_log_sigma - K.square(z_mean)
                                - K.exp(z_log_sigma), axis=-1)
        return xent_loss + kl_loss
    return loss

model.compile(optimizer=optimizer,
              loss=custom_loss_wrapper(z_mean, z_log_sigma))
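
Training then proceeds exactly as in the question; the sketch below reuses the question's own names (genfun, display_interval), which are assumed to be defined as before.

# fit exactly as in the question; the KL term now travels inside
# the compiled loss rather than via add_loss
history = model.fit_generator(
    genfun,
    steps_per_epoch=display_interval,
    epochs=1,
    shuffle=False,
    verbose=1,
)

Because z_mean and z_log_sigma are captured by the wrapper, the returned loss(y_true, y_pred) keeps references to the live tensors even though Keras only ever calls it with two arguments; this closure trick is the standard way to feed extra tensors into a Keras loss.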