PyTorch code:
import torch
import torch.nn as nn
from torch.autograd import Variable

up = nn.ConvTranspose2d(3, 128, 2, stride=2)  # transposed convolution: 3 -> 128 channels
conv = nn.Conv2d(3, 128, 2)                   # regular convolution: 3 -> 128 channels

inputs = Variable(torch.rand(1, 3, 64, 64))
print('up conv output size:', up(inputs).size())
inputs = Variable(torch.rand(1, 3, 64, 64))
print('conv output size:', conv(inputs).size())
print('up conv weight size:', up.weight.data.shape)
print('conv weight size:', conv.weight.data.shape)
Output:
up conv output size: torch.Size([1, 128, 128, 128])
conv output size: torch.Size([1, 128, 63, 63])
up conv weight size: torch.Size([3, 128, 2, 2])
conv weight size: torch.Size([128, 3, 2, 2])
Why do the first two weight dimensions come in a different order: (3, 128) for ConvTranspose2d but (128, 3) for Conv2d? Is it supposed to behave like this?
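
For context, one way the two layouts can be seen to line up is by treating the transposed convolution as the gradient of a regular convolution with respect to its input: a ConvTranspose2d(3, 128) then shares its weight shape (3, 128, kH, kW) with a Conv2d(128, 3). Below is a minimal sketch checking this numerically with torch.autograd.grad; the tensor names and shapes are chosen for illustration, not taken from the code above.

import torch
import torch.nn.functional as F

# One weight tensor, shaped as both a Conv2d(128, 3) weight and a
# ConvTranspose2d(3, 128) weight: (3, 128, 2, 2)
w = torch.randn(3, 128, 2, 2)
x = torch.randn(1, 128, 64, 64, requires_grad=True)

# Regular convolution: 128 channels -> 3 channels
y = F.conv2d(x, w, stride=2)

# Gradient of that convolution w.r.t. its input, for an arbitrary upstream gradient g
g = torch.randn_like(y)
(grad_x,) = torch.autograd.grad(y, x, grad_outputs=g)

# The same gradient computed as a transposed convolution with the identical weight
up = F.conv_transpose2d(g, w, stride=2)

print(torch.allclose(grad_x, up, atol=1e-4))  # expected: True

If this check holds, the differing order is just a consequence of the two modules storing the same underlying weight: Conv2d keeps it as (out_channels, in_channels, kH, kW), and ConvTranspose2d keeps it as (in_channels, out_channels, kH, kW).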