Custom loss function in Keras to penalize negative predictions more

Date: 2018-07-23 14:20:02

Tags: function keras loss

I understand that MSE treats the cases actual > predicted and predicted > actual the same way. I want to write a custom loss function in which predicted > actual costs more than actual > predicted; say I incur a 2x penalty whenever the prediction is above the actual value. How would I implement such a function?
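For example, the behaviour I am after could be sketched in plain NumPy like this (this only illustrates the intended 2x penalty, it is not a Keras-compatible loss):

import numpy as np

def asymmetric_mse(y_true, y_pred, penalty=2.0):
    # squared error, weighted by `penalty` wherever the prediction overshoots the target
    diff = y_pred - y_true
    weights = np.where(diff > 0, penalty, 1.0)
    return np.mean(weights * diff ** 2)

print(asymmetric_mse(np.array([10.0]), np.array([12.0])))  # over-prediction:  8.0
print(asymmetric_mse(np.array([10.0]), np.array([8.0])))   # under-prediction: 4.0

My current model setup is: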

import numpy as np

from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import EarlyStopping
import keras.backend as K

def create_model():
    # define the sizes
    input_size = 6
    hidden_size = 15
    # define the model
    model = Sequential()
    model.add(Dense(input_size, input_dim=input_size, kernel_initializer='normal', activation='relu'))
    model.add(Dense(hidden_size, kernel_initializer='normal', activation='relu'))
    model.add(Dense(1, kernel_initializer='normal'))

    # mse is used as the loss so the optimiser converges quickly
    # mae is a metric whose magnitude is easy to interpret
    model.compile(optimizer='adam', loss='mse', metrics=['mae'])

    return model

model = create_model()
early_stop = EarlyStopping(monitor='val_loss', patience=20)
history = model.fit(train_features, train_label, epochs=200, validation_split=0.2,
                    verbose=0, shuffle=True, callbacks=[early_stop])
predvalue = model.predict(test_features).flatten() * 100

How can I implement such a loss function?

1 Answer:

Answer 0 (score: 0)

def customLoss(true, pred):
    diff = pred - true

    greater = K.greater(diff, 0)
    greater = K.cast(greater, K.floatx())  # 0 where pred < true, 1 where pred > true
    greater = greater + 1                  # weight 1 for under-prediction, 2 for over-prediction

    # use some kind of loss here, such as mse or mae, or pick one from keras
    # using mse:
    return K.mean(greater * K.square(diff))

model.compile(optimizer='adam', loss=customLoss)
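As a quick sanity check (assuming the imports above, with K being keras.backend, and a standalone Keras backend where K.eval can evaluate small tensors directly), you can confirm that an over-prediction is penalised twice as much as an under-prediction of the same size:

y_true = K.constant([10.0])

print(K.eval(customLoss(y_true, K.constant([12.0]))))  # over-prediction:  8.0
print(K.eval(customLoss(y_true, K.constant([8.0]))))   # under-prediction: 4.0

Note that Keras calls the loss as loss(y_true, y_pred), so true and pred above map to the ground truth and the prediction respectively, and diff = pred - true is positive exactly when the model over-predicts.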