Padding problem in a 1D convolutional autoencoder (Keras)

Time: 2019-04-12 21:09:54

Tags: keras conv-neural-network padding autoencoder

I am experimenting with a convolutional autoencoder on some graphs. However, I have a problem with the decoded signal dropping to zero at the edges (see figure).

Ground truth graph (left) and decoded graph (right)

I believe this is caused by the padding. With zero padding, the filters produce smaller values at the edges during both encoding and decoding, so the decoded signal approaches zero near the borders.

Is there a way to solve this? I could subtract the mean of the signal to reduce the problem, but the mean is an important number that I want to be part of the encoded values (not bypassing the convolutions). A fluctuation around a high value can mean something different from the same fluctuation around a low value.

I tried causal padding; the problem loses its symmetry (the end of the signal looks better), but the phenomenon persists. Reducing the filter sizes and removing dilation seemed to help, yet the problem remains.
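To illustrate the edge effect described above, here is a small NumPy sketch (not from the original post) comparing zero padding with reflection padding on a constant signal. With zero padding, the border outputs of an averaging filter are pulled toward zero; with reflection padding they are preserved:

```python
import numpy as np

signal = np.full(10, 5.0)       # constant signal of value 5
kernel = np.ones(3) / 3.0       # simple 3-tap averaging filter

# Zero padding ('same'): the borders see padded zeros and are attenuated
zero_out = np.convolve(signal, kernel, mode='same')

# Reflection padding: mirror the edges, then convolve without padding
padded = np.pad(signal, 1, mode='reflect')
refl_out = np.convolve(padded, kernel, mode='valid')

print(zero_out[0], zero_out[5])   # border value attenuated, interior intact
print(refl_out[0], refl_out[5])   # border value preserved
```

Stacking several such layers, as the autoencoder does, compounds the attenuation and widens the affected border region.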

Any help? :-)

Code:

from tensorflow import keras
from tensorflow.keras.layers import Input, Conv1D, UpSampling1D
from tensorflow.keras.models import Model

class Autoencoder(object):
    (...)
    padding = 'same'
    filter_sizes = [3, 3, 3, 3, 3, 3]
    (...)
    def build_model(self):
        # Encoder: three stride-2 convolutions reduce the length by a factor of 8
        self.inputs = Input(shape=(intSampleLength, 1))
        x = Conv1D(50, self.filter_sizes[0], padding=self.padding, activation=keras.layers.LeakyReLU(alpha=0.2))(self.inputs)
        x = Conv1D(80, self.filter_sizes[1], padding=self.padding, strides=2, activation=keras.layers.LeakyReLU(alpha=0.2))(x)
        x = Conv1D(100, self.filter_sizes[2], padding=self.padding, activation=keras.layers.LeakyReLU(alpha=0.2))(x)
        x = Conv1D(100, self.filter_sizes[3], padding=self.padding, strides=2, activation=keras.layers.LeakyReLU(alpha=0.2))(x)
        x = Conv1D(100, self.filter_sizes[4], padding=self.padding, activation=keras.layers.LeakyReLU(alpha=0.2))(x)
        self.encoded = Conv1D(100, self.filter_sizes[5], padding=self.padding, strides=2, activation=keras.layers.LeakyReLU(alpha=0.2))(x)

        # Decoder: mirror the encoder, upsampling back to the original length
        x = Conv1D(100, self.filter_sizes[5], padding=self.padding, activation=keras.layers.LeakyReLU(alpha=0.2))(self.encoded)
        x = Conv1D(100, self.filter_sizes[4], padding=self.padding, activation=keras.layers.LeakyReLU(alpha=0.2))(x)
        x = UpSampling1D(2)(x)
        x = Conv1D(100, self.filter_sizes[3], padding=self.padding, activation=keras.layers.LeakyReLU(alpha=0.2))(x)
        x = Conv1D(100, self.filter_sizes[2], padding=self.padding, activation=keras.layers.LeakyReLU(alpha=0.2))(x)
        x = UpSampling1D(2)(x)
        x = Conv1D(80, self.filter_sizes[1], padding=self.padding, activation=keras.layers.LeakyReLU(alpha=0.2))(x)
        x = Conv1D(50, self.filter_sizes[0], padding=self.padding, activation=keras.layers.LeakyReLU(alpha=0.2))(x)
        x = UpSampling1D(2)(x)
        decoded = Conv1D(1, 3, padding=self.padding, activation=None)(x)
        self.autoencode_model = Model(inputs=self.inputs, outputs=decoded)

        return self.autoencode_model
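One possible mitigation, sketched below under assumptions rather than as a tested fix for the model above, is to replace zero padding with reflection padding: wrap `tf.pad` in a `Lambda` layer and use `padding='valid'` in the convolution, so the sequence length is preserved without introducing zeros at the border. The sample length of 128 and the layer sizes are illustrative:

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Conv1D, Lambda
from tensorflow.keras.models import Model

def reflect_pad_1d(x, pad=1):
    # Mirror the time axis instead of padding it with zeros
    return tf.pad(x, [[0, 0], [pad, pad], [0, 0]], mode='REFLECT')

inputs = Input(shape=(128, 1))              # illustrative sample length
x = Lambda(reflect_pad_1d)(inputs)          # length 128 -> 130
x = Conv1D(50, 3, padding='valid')(x)       # 'valid' conv brings it back to 128
model = Model(inputs, x)
```

Each `padding='same'` convolution in the question's model could in principle be replaced by such a pad-then-valid-conv pair; for kernel size k, the pad width on each side would be (k - 1) // 2.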

0 Answers:

There are no answers.