I want to write a custom loss function in Keras that also uses the gradient of the output with respect to the input.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Dropout
model = Sequential()
model.add(Dense(200, input_dim=20, activation='relu'))
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])
Now I want to replace the loss function. I only know how to write it in plain TensorFlow, like this:
x = tf.placeholder(tf.float32, shape=[None, 1], name="x")
ext_f0 = tf.placeholder(tf.float32, shape=[None, 1], name="f")
xx0 = tf.concat([[[R_variable['x_start'] * 1.0],[R_variable['x_end'] * 1.0]], x], 0)
yy0 = univAprox(xx0)
Boundary_y = yy0[0:2]
y = yy0[2:]
GradU = tf.gradients(y, x)  # ***KEY***
GradSum = tf.reduce_sum(tf.square(GradU)) / 2 * dx_train
FSum = tf.reduce_sum(tf.multiply(ext_f0,y)) * dx_train
loss = GradSum + FSum + Beta * tf.reduce_sum(tf.square(Boundary_y))
The key point is that I need to obtain GradU. How can I do this in Keras?
Answer 0 (score: 0)
You can write a custom loss function. What you need to know is its required signature:
def custom_loss(y_true, y_pred):
    # compute the loss from y_true and y_pred
    ...
    return loss
That is the form it should take.
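Note, however, that this `(y_true, y_pred)` signature never sees the model input, so the GradU term from the question cannot be computed inside such a function directly. In TF2-era Keras, one workaround is to subclass `keras.Model` and override `train_step` with two nested `tf.GradientTape`s: the inner tape differentiates the output with respect to the input (the GradU term), the outer tape differentiates the total loss with respect to the weights. A minimal sketch, under the assumption of a simple gradient-norm penalty; the `ext_f0`, `dx_train`, and `Beta` terms from the question are omitted, and the model shape is illustrative:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

class GradLossModel(keras.Model):
    """Model whose training loss includes the gradient of the output w.r.t. the input."""

    def train_step(self, data):
        x, y = data
        with tf.GradientTape() as param_tape:
            # Inner tape: differentiate the output with respect to the input x.
            with tf.GradientTape() as input_tape:
                input_tape.watch(x)
                y_pred = self(x, training=True)
            grad_u = input_tape.gradient(y_pred, x)          # dy/dx, the "GradU" term
            grad_penalty = tf.reduce_sum(tf.square(grad_u)) / 2.0
            bce = tf.reduce_mean(keras.losses.binary_crossentropy(y, y_pred))
            loss = bce + grad_penalty
        # Outer tape: differentiate the combined loss with respect to the weights.
        grads = param_tape.gradient(loss, self.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.trainable_variables))
        return {"loss": loss}

# Illustrative small model using the custom training step.
inputs = keras.Input(shape=(20,))
h = layers.Dense(64, activation='relu')(inputs)
outputs = layers.Dense(1, activation='sigmoid')(h)
model = GradLossModel(inputs, outputs)
model.compile(optimizer='rmsprop')
```

With the TF1-style backend the question was written for, a similar effect can be had with `keras.backend.gradients(model.output, model.input)` combined with `model.add_loss`, which attaches a symbolic loss tensor that is free to reference the input.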