I am using Bayesian LSTM code based on the edward2 library from the paper "Bayesian Layers: A Module for Neural Network Uncertainty":
import edward2 as ed
import tensorflow as tf

lstm = ed.layers.LSTMCellReparameterization(8)
output_layer = tf.keras.layers.Dense(4)

def loss_fn(x, label, dataset_size):
    # x is assumed to have shape [batch, time, features]; label is a one-hot target.
    state = lstm.get_initial_state(x)
    nll = 0.
    for t in range(x.shape[1]):               # loop over time steps
        net, state = lstm(x[:, t, :], state)
        logits = output_layer(net)
        nll += tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(
            labels=label, logits=logits))
    kl = sum(lstm.losses) / dataset_size       # KL term from the Bayesian LSTM weights
    loss = nll + kl
    return loss

loss1 = loss_fn(b1, Y, 2000)
I want to use this code to train the neural network. Can someone help me?
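My understanding is that training would mean wrapping loss_fn in a tf.GradientTape, computing gradients with respect to the variables of lstm and output_layer, and applying them with an optimizer. Below is a rough sketch of what I have in mind (the Adam optimizer, learning rate, and number of epochs are placeholders, and b1 and Y are my own data), but I am not sure it is correct:

optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)

for epoch in range(100):                       # placeholder number of training steps
    with tf.GradientTape() as tape:
        loss = loss_fn(b1, Y, 2000)            # same dataset_size as above
    # The layers build their variables on the first forward pass,
    # so collect them after loss_fn has been called at least once.
    variables = lstm.trainable_variables + output_layer.trainable_variables
    grads = tape.gradient(loss, variables)
    optimizer.apply_gradients(zip(grads, variables))
    print("epoch", epoch, "loss", float(loss))

Is this the right way to train a Bayesian LSTM built with edward2, or is there a recommended training loop for these layers?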