I'm trying to create a baseline model for an NER task using a bidirectional LSTM with the functional API provided by Keras.
The embedding layer I'm using produces 100-dimensional feature vectors.
The input to this layer is sequences padded to a fixed length:
MAX_LEN = 575
(Note: the input and output have the same dimensions.)
I want an output at every time step, so I have set
return_sequences = True
The output is just the activations passed through a softmax layer. However, when compiling the model, I keep getting this warning:
UserWarning: Model inputs must come from `keras.layers.Input`
(thus holding past layer metadata), they cannot be the output of a
previous non-Input layer. Here, a tensor specified as input to your model was
not an Input tensor, it was generated by layer embedding_3.
Note that input tensors are instantiated via `tensor = keras.layers.Input(shape)`.
The tensor that caused the issue was: embedding_3_40/embedding_lookup/Identity:0
along with an
AssertionError:
Traceback (excerpt):
---> 37 model = Model(inputs = nn_input, outputs = nn_output)
---> 91 return func(*args, **kwargs)
---> 93 self._init_graph_network(*args, **kwargs)
222 # It's supposed to be an input layer, so only one node
223 # and one tensor output.
--> 224 assert node_index == 0
I tried to debug the code to check the dimensions, but they appear to match, as highlighted by the comments in the code:
nn_input = Input(shape = (MAX_LEN,) , dtype = 'int32')
print(nn_input.shape) #(?, 575)
nn_input = embedding_layer(nn_input)
print(nn_input.shape) #(?, 575, 100)
nn_out, forward_h, forward_c, backward_h, backward_c = Bidirectional(LSTM(MAX_LEN, return_sequences = True, return_state = True))(nn_input)
print(forward_h.shape) #(?, 575)
print(forward_c.shape) #(?, 575)
print(backward_h.shape) #(?, 575)
print(backward_c.shape) #(?, 575)
print(nn_out.shape) #(?, ?, 1150)
state_h = Concatenate()([forward_h, backward_h])
state_c = Concatenate()([forward_c, backward_c])
print(state_h.shape) #(?, 1150)
print(state_c.shape) #(?, 1150)
densor = Dense(100, activation='softmax')
nn_output = densor(nn_out)
print(nn_output.shape) #(?, 575, 100)
model = Model(inputs = nn_input, outputs = nn_output)
This may seem trivial to some, but I'm worried that my understanding of LSTMs, or at least of Keras, is flawed.
I'll provide additional details in an edit if necessary.
Any help would be greatly appreciated!
Answer 0 (score: 1)
As the error indicates, you must pass a tensor that is the output of a keras.layers.Input layer to the Model API. In this case, the variable nn_input has been overwritten with the output of embedding_layer, so by the time it reaches Model it no longer refers to the Input tensor. Rename the variable that holds the output of embedding_layer to something else, so the original Input tensor is preserved.
nn_input = Input(shape = (MAX_LEN,) , dtype = 'int32')
# the line below is the cause of the error. Change the output variable name to something like nn_embed.
nn_input = embedding_layer(nn_input)
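To make this concrete, here is a minimal end-to-end sketch of the corrected model build. The vocabulary size (VOCAB_SIZE) and the variable name nn_embed are assumptions for illustration; the layer sizes mirror the question's code.

```python
from tensorflow.keras.layers import Input, Embedding, LSTM, Bidirectional, Dense
from tensorflow.keras.models import Model

MAX_LEN = 575
VOCAB_SIZE = 10000   # assumed vocabulary size, for illustration only
EMBED_DIM = 100

# Keep a separate reference to the Input tensor so it can be passed to Model.
nn_input = Input(shape=(MAX_LEN,), dtype='int32')

# Renamed output variable: this no longer overwrites nn_input.
nn_embed = Embedding(VOCAB_SIZE, EMBED_DIM)(nn_input)

# Bidirectional LSTM with an output at every time step: (None, 575, 1150)
nn_out = Bidirectional(LSTM(MAX_LEN, return_sequences=True))(nn_embed)

# Per-timestep softmax over 100 classes: (None, 575, 100)
nn_output = Dense(100, activation='softmax')(nn_out)

# nn_input still refers to the Input tensor, so this compiles without the warning.
model = Model(inputs=nn_input, outputs=nn_output)
```

With the Input reference intact, `model.input_shape` is `(None, 575)` and `model.output_shape` is `(None, 575, 100)`, matching the shapes printed in the question.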