I am working with an LSTM recurrent neural network and want to difference (as in np.diff) the outputs/predictions of the final Dense layer I have.
Consider a simple network -
from tensorflow.keras.layers import Input, LSTM, Dense

data_in = Input(shape=(10, 4))  # 10 time steps and 4 features
x = LSTM(4, activation='relu')(data_in)
x = Dense(32)(x)
out = Dense(1)(x)
What I want to achieve is a bit tricky; expressed in plain NumPy-style code it would look like this -
if batch_no == 1:
    # for the first batch, since no history is available, assume a start value of 0 --> prepend=0
    final_out = np.diff(out, 1, prepend=0)
    # hold on to the final prediction of this batch to differentiate the output of the next batch
    carry_forward = out[-1]
else:
    # use the final prediction from the previous batch to differentiate the output of the current batch
    final_out = np.diff(out, 1, prepend=carry_forward)
    # again hold on to the final prediction to use for the next batch
    carry_forward = out[-1]
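To make the intended behaviour concrete, here is a small self-contained NumPy run over two hypothetical batches (the values in batch1 and batch2 are made up purely for illustration):

import numpy as np

# made-up predictions of the final Dense layer for two consecutive batches
batch1 = np.array([1.0, 3.0, 6.0])
batch2 = np.array([10.0, 15.0])

# first batch: no history available, so prepend 0
final_out_1 = np.diff(batch1, 1, prepend=0)               # [1. 2. 3.]
carry_forward = batch1[-1]                                 # 6.0

# second batch: prepend the prediction carried over from the first batch
final_out_2 = np.diff(batch2, 1, prepend=carry_forward)   # [4. 5.]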
Here is what I have tried so far -
import tensorflow as tf
from tensorflow.keras.layers import Lambda

# subtract the tensor rolled by one step along the batch axis
final_out = Lambda(lambda m: m - tf.roll(m, 1, axis=0))(out)

# build the shifted tensor manually and try to fix up its first element
if batch_no == 1:
    subtract = tf.identity(out)
    subtract = tf.roll(subtract, 1, axis=0)
    subtract[0] = 0  # doesn't work, since eager tensors are not enabled here
    final_out = out - subtract
    carry_forward = out[-1].numpy()
else:
    subtract = tf.identity(out)
    subtract = tf.roll(subtract, 1, axis=0)
    subtract[0] = carry_forward  # doesn't work, since eager tensors are not enabled here
    final_out = out - subtract
    carry_forward = out[-1]
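For illustration, here is a rough sketch of the behaviour I am after, written with plain TensorFlow ops (tf.concat and slicing) instead of item assignment; the helper name diff_with_carry and the shapes are my own assumptions, not something I have working inside the model:

import tensorflow as tf

def diff_with_carry(out, carry_forward):
    # `out`: predictions of the current batch, shape (batch_size, 1)
    # `carry_forward`: last prediction of the previous batch, shape (1, 1)
    # shift `out` down by one along the batch axis, filling the first slot
    # with the carried-over value instead of assigning to index 0
    shifted = tf.concat([carry_forward, out[:-1]], axis=0)
    return out - shifted

# toy usage with made-up values
out1 = tf.constant([[1.0], [3.0], [6.0]])
print(diff_with_carry(out1, tf.zeros([1, 1])))  # first batch, start from 0 -> [[1.], [2.], [3.]]
carry = out1[-1:]                               # keep the leading dim for the next batch

This handles the differencing itself, but I still do not see how to keep carry_forward alive from one batch to the next inside the model, which is the part I am stuck on.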
Note
I cannot use tf.experimental.numpy.diff here, since I am running TensorFlow 2.1.0.
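For what it is worth, a plain first difference can still be written with slicing alone in 2.1, but as far as I can tell it leaves no place to prepend the carry-forward value:

import tensorflow as tf

preds = tf.constant([[1.0], [3.0], [6.0]])   # made-up batch of predictions
diffed = preds[1:] - preds[:-1]               # [[2.], [3.]]; the first difference is lost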
Which brings me to my question: how can I implement this differencing, carrying the last prediction forward from one batch to the next, inside the model?
Any pointers would be very helpful.