I have the following autoencoder architecture (PyTorch):
channels = 12
# encoder
self.conv0 = nn.Conv2d(1, channels, kernel_size=3, stride=1)
self.mp0 = nn.MaxPool2d((2, 2), stride=2, return_indices=True)
self.conv1 = nn.Conv2d(channels, channels * 2, padding=1, kernel_size=3, stride=1)
self.mp1 = nn.MaxPool2d((3, 3), stride=2, return_indices=True)
self.conv2 = nn.Conv2d(channels * 2, channels * 2, kernel_size=3, stride=1)
# decoder
self.convtransp0 = nn.ConvTranspose2d(channels * 2, channels * 2, kernel_size=3, stride=1)
self.mup0 = nn.MaxUnpool2d((2, 2), stride=2)
self.convtransp1 = nn.ConvTranspose2d(channels * 2, channels, kernel_size=3, stride=1)
self.mup1 = nn.MaxUnpool2d((2, 2), stride=2)
self.convtransp2 = nn.ConvTranspose2d(channels, 1, kernel_size=3, stride=1)
I intend to make it symmetric. My images are 512x512. When I try to run it, I get an exception caused by a size mismatch between convtransp1 and mup1: torch.Size([5, 12, 256, 256]) vs. torch.Size([5, 12, 255, 255]). My question is: what is the generally accepted way of dealing with this kind of mismatch?
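For reference, this is how I traced where the two sizes come from (a standalone sketch that just re-creates the encoder layers with the same settings; the variable names are only for illustration):

import torch
import torch.nn as nn

conv0 = nn.Conv2d(1, 12, kernel_size=3, stride=1)
mp0   = nn.MaxPool2d((2, 2), stride=2, return_indices=True)
conv1 = nn.Conv2d(12, 24, padding=1, kernel_size=3, stride=1)
mp1   = nn.MaxPool2d((3, 3), stride=2, return_indices=True)
conv2 = nn.Conv2d(24, 24, kernel_size=3, stride=1)

y = torch.randn(5, 1, 512, 512)
y = conv0(y)          # [5, 12, 510, 510]
y, ixs0 = mp0(y)      # [5, 12, 255, 255]  <- the saved indices describe a 255x255 map
y = conv1(y)          # [5, 24, 255, 255]
y, ixs1 = mp1(y)      # [5, 24, 127, 127]
y = conv2(y)          # [5, 24, 125, 125]
# On the way back up: convtransp0 gives 127x127, mup0 gives 254x254 by default,
# and convtransp1 gives 256x256, which no longer matches the 255x255 indices
# saved by mp0 -- hence the exception.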
At the moment I am taking the simplest approach:
def forward(self, x):
...
y = self.convtransp1(y)
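# crop 256x256 down to 255x255 so the tensor matches the indices saved by mp0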
y = y[:, :, :255, :255]
y = F.relu(y)
y = self.mup1(y, ixs0)
...
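A slightly more robust variant of the same idea, which I am also considering, takes the crop size from the saved indices tensor instead of hard-coding 255 (just a sketch, assuming ixs0 is the indices tensor returned by mp0 in the encoder pass):

def forward(self, x):
    ...
    y = self.convtransp1(y)
    # crop to the spatial size of the saved pooling indices
    y = y[:, :, :ixs0.size(2), :ixs0.size(3)]
    y = F.relu(y)
    y = self.mup1(y, ixs0)
    ...

But I am unsure whether cropping (or padding) like this is the accepted way to build such decoders, or whether the layers themselves should be chosen so that the sizes line up naturally.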