I am working on a model based on this paper. I have an Embedding layer with the mask_zero argument set to True. However, since the subsequent GlobalMaxPooling1D layer does not support masking, I get an exception. The exception is to be expected, as the documentation of the Embedding layer actually states that any layer following an Embedding layer with mask_zero = True should support masking.

However, as I am processing sentences with a variable number of words in them (i.e. varying input lengths), I do need the masking from the Embedding layer. My question is: how should I change my model so that masking remains part of the model and does not cause problems at the GlobalMaxPooling1D layer?

Below is the code for the model.
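For context, GlobalMaxPooling1D reduces over the time axis of its input. A plain-NumPy sketch with made-up embedding values shows why an unmasked padding row can dominate the max when the real token features happen to be negative:

```python
import numpy as np

# Hypothetical embedded sentence: 3 real tokens plus 1 zero-padded timestep.
# The feature values are invented purely for illustration.
emb = np.array([
    [-0.5, -1.2],   # token 1
    [-0.3, -0.9],   # token 2
    [-0.8, -0.1],   # token 3
    [ 0.0,  0.0],   # padding timestep (index 0 embedded without masking)
])

# GlobalMaxPooling1D takes the maximum over the time axis (axis 0 here).
pooled = emb.max(axis=0)
print(pooled)  # the all-zero padding row wins over every negative feature
```

This is the scenario masking guards against; whether it matters in practice depends on whether the model can learn to keep the padding embedding out of the way.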
Answer 0 (score: 1)
However, as I am processing sentences with variable number of words in them, I do need the masking in the Embedding layer.
Are you padding the sentences to make them have equal lengths? If so, then instead of masking, you can let the model find out on its own that 0 is padding and should therefore be ignored, so you would not need explicit masking. This approach is also used for dealing with missing values in the data, as suggested in this answer.
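A minimal sketch of that padding step, using hypothetical token ids with index 0 reserved for padding (in Keras you would typically use pad_sequences for this):

```python
# Hypothetical token-id sequences of varying length; 0 is reserved for padding.
sentences = [[5, 12, 7], [9, 3], [4, 8, 2, 6]]

# Right-pad every sequence with 0 up to the longest length in the batch.
max_len = max(len(s) for s in sentences)
padded = [s + [0] * (max_len - len(s)) for s in sentences]
print(padded)  # [[5, 12, 7, 0], [9, 3, 0, 0], [4, 8, 2, 6]]
```

With mask_zero = False, the Embedding layer then simply receives index 0 like any other token and learns an embedding for it, which the model can drive toward values that do not affect the pooled maximum.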