Tensorflow Lambda layer to difference predictions across batches

Time: 2020-10-15 23:09:47

Tags: tensorflow keras lstm

I am working on an LSTM recurrent neural network and want to difference the outputs/predictions of the last Dense layer I have.

Consider a simple network -

from tensorflow.keras.layers import Input, LSTM, Dense

data_in = Input(shape=(10, 4))   # 10 time steps and 4 features
x = LSTM(4, activation='relu')(data_in)
x = Dense(32)(x)
out = Dense(1)(x)

What I am trying to achieve is a bit tricky; written in plain code it would look like this -

if batch_no == 1:
    # for the first batch, since no history is available, assume we start with 0 --> prepend=0
    final_out = np.diff(out, 1, prepend=0)
    # hold on to the final prediction in the batch to differentiate the output of the next batch
    carry_forward = out[-1]
else:
    # use the final prediction from the previous batch to differentiate the output of the current batch
    final_out = np.diff(out, 1, prepend=carry_forward)
    # again hold on to the final prediction to use for the next batch
    carry_forward = out[-1]
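
To make the prepend semantics concrete, here is a toy run of that numpy logic (the prediction values are made up) -

import numpy as np

out = np.array([0.5, 0.75, 0.25])       # toy predictions for one batch
print(np.diff(out, 1, prepend=0))       # -> [0.5, 0.25, -0.5] : first batch, start from 0
print(np.diff(out, 1, prepend=0.25))    # -> [0.25, 0.25, -0.5] : later batch, carry_forward = 0.25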

What I have tried so far -

  1. A Lambda layer looks promising, but I am not sure how to use carry_forward in it. I can do something like the following, but the trade-off is that one particular prediction (the first of every batch) gets differenced against an unrelated value -
final_out = Lambda(lambda m: m - tf.roll(m, 1, axis=0))(out)
  2. Something very similar to the if-else I have already written, but with tensor operations -
if batch_no == 1:
    subtract = tf.identity(out)
    subtract = tf.roll(subtract, 1, axis=0)
    subtract[0] = 0                           # doesn't work, because eager tensors are not enabled
    final_out = out - subtract
    carry_forward = out[-1].numpy()        
else:
    subtract = tf.identity(out)
    subtract = tf.roll(subtract, 1, axis=0)
    subtract[0] = carry_forward               # doesn't work, because eager tensors are not enabled
    final_out = out - subtract
    carry_forward = out[-1]
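
  3. I suspect the item assignment above could be avoided by building subtract with a concat instead of a roll (a rough sketch I have not actually verified), though that still leaves carry_forward to be threaded from one batch to the next -

subtract = tf.concat([tf.zeros_like(out[:1]), out[:-1]], axis=0)   # same effect as roll + subtract[0] = 0
final_out = out - subtract                                         # differences within the batch only
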
Note

I cannot use tf.experimental.numpy.diff here because I am running tensorflow 2.1.0.

Which brings me to the questions -

  1. Are there any side effects of enabling eager tensors?
  2. How can this be achieved?
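
For question 2, the rough shape I can imagine (a completely unverified sketch; the DiffAcrossBatches name, the single-output-unit assumption and the non-trainable tf.Variable are all my own guesses) is a custom layer that keeps carry_forward as state, though I do not know whether assigning to a variable inside call is sound in graph mode -

import tensorflow as tf

class DiffAcrossBatches(tf.keras.layers.Layer):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # assumes the preceding Dense layer has a single unit; starting at 0 matches prepend=0
        self.carry_forward = tf.Variable([[0.0]], trainable=False)

    def call(self, out):
        # prepend the value carried over from the previous batch, then take first differences
        prepended = tf.concat([self.carry_forward, out], axis=0)
        final_out = prepended[1:] - prepended[:-1]
        # remember the last prediction of this batch for the next one
        self.carry_forward.assign(out[-1:])
        return final_out

The idea would be to end the model with final_out = DiffAcrossBatches()(out) instead of using out directly.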

Any pointers would be really helpful.

0 Answers:

No answers yet.