I am trying to create a simple LSTM-based RNN in tensor2tensor.
Training appears to work so far, but I am unable to restore the model. Attempting to do so throws a NotFoundError pointing at one of the LSTM's bias nodes:
NotFoundError: ..
Key bidirectional/backward_lstm/bias not found in checkpoint
I have no idea why this is happening.
This was actually meant as a workaround for another problem: I ran into a similar issue using the LSTMs that ship with tensor2tensor (https://github.com/tensorflow/tensor2tensor/issues/1616).
$ pip freeze | grep tensor
mesh-tensorflow==0.0.5
tensor2tensor==1.12.0
tensorboard==1.12.0
tensorflow-datasets==1.0.2
tensorflow-estimator==1.13.0
tensorflow-gpu==1.12.0
tensorflow-metadata==0.9.0
tensorflow-probability==0.5.0
import tensorflow as tf
from tensorflow.keras.layers import LSTM, Activation, Bidirectional, concatenate, dot

def body(self, features):
    # drop the channel dimension: [batch, time, 1, depth] -> [batch, time, depth]
    inputs = features['inputs'][:, :, 0, :]
    hparams = self._hparams
    problem = hparams.problem
    encoders = problem.feature_info
    max_input_length = 350
    max_output_length = 350
    # bidirectional encoder; concatenated forward/backward states give 2 * 128 = 256 units
    encoder = Bidirectional(LSTM(128, return_sequences=True, unroll=False),
                            merge_mode='concat')(inputs)
    encoder_last = encoder[:, -1, :]
    # decoder initialised with the encoder's final state
    decoder = LSTM(256, return_sequences=True, unroll=False)(
        inputs, initial_state=[encoder_last, encoder_last])
    # Luong-style attention over the encoder outputs
    attention = dot([decoder, encoder], axes=[2, 2])
    attention = Activation('softmax', name='attention')(attention)
    context = dot([attention, encoder], axes=[2, 1])
    concat = concatenate([context, decoder])
    # restore the channel dimension expected by tensor2tensor
    return tf.expand_dims(concat, 2)
NotFoundError (see above for traceback): Restoring from checkpoint failed. This is most likely due to a Variable name or other graph key that is missing from the checkpoint. Please ensure that you have not altered the graph expected based on the checkpoint. Original error:
Key while/lstm_keras/parallel_0_4/lstm_keras/lstm_keras/body/bidirectional/backward_lstm/bias not found in checkpoint
[[node save/RestoreV2 (defined at /home/sfalk/tmp/pycharm_project_265/asr/model/persistence.py:282) = RestoreV2[dtypes=[DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_save/Const_0_0, save/RestoreV2/tensor_names, save/RestoreV2/shape_and_slices)]]
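One way to narrow this down is to list the keys the checkpoint actually contains and compare them against the name in the NotFoundError. A minimal sketch using `tf.train.list_variables` (the tiny throwaway checkpoint below is only a stand-in for the real tensor2tensor model directory):

```python
import os
import tempfile

import tensorflow as tf

# Save a tiny checkpoint so there is something to inspect; in practice,
# point list_variables at the tensor2tensor model directory instead.
ckpt_dir = tempfile.mkdtemp()
checkpoint = tf.train.Checkpoint(bias=tf.Variable([0.0] * 4))
checkpoint.save(os.path.join(ckpt_dir, 'ckpt'))

# Print every (key, shape) pair stored in the checkpoint; comparing these
# keys against the key in the error message shows exactly which prefix
# (e.g. 'while/...') is spurious or missing.
for name, shape in tf.train.list_variables(ckpt_dir):
    print(name, shape)
```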
What could be the problem here, and how can I fix it?
Answer 0 (score: 1)
This seems to be related to https://github.com/tensorflow/tensor2tensor/issues/1486. During restore from a checkpoint, tensor2tensor appears to prepend "while" to the key names. It looks like an open bug, so your input on GitHub would be appreciated.
I would have left this as a comment, but my reputation is too low. Cheers.
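Until the bug is fixed, one possible stopgap is to restore with a name-mapped Saver that strips the spurious prefix. This is only a sketch: the exact prefix (and whether stripping it is sufficient) is an assumption based on the error message above, not a confirmed fix.

```python
def strip_spurious_prefix(name, prefix='while/'):
    """Drop the extra prefix that restore prepends to graph variable names.

    The default 'while/' prefix is an assumption taken from the error
    message; adjust it to whatever list_variables reveals.
    """
    return name[len(prefix):] if name.startswith(prefix) else name

# With TensorFlow 1.x, this could feed a name-mapped Saver, e.g.:
#   var_map = {strip_spurious_prefix(v.op.name): v for v in tf.global_variables()}
#   saver = tf.train.Saver(var_list=var_map)
#   saver.restore(session, checkpoint_path)

print(strip_spurious_prefix('while/body/bidirectional/backward_lstm/bias'))
```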