Unable to replace the classifier on DenseNet121 in PyTorch

Asked: 2019-09-05 13:51:22

Tags: neural-network classification conv-neural-network pytorch transfer-learning

I'm trying to do some transfer learning using this GitHub DenseNet121 model (https://github.com/gaetandi/cheXpert.git). I'm running into problems changing the size of the classification layer from 14 to 2 outputs.

The relevant part of the GitHub code is:

class DenseNet121(nn.Module):
    """Model modified.
    The architecture of our model is the same as standard DenseNet121
    except the classifier layer which has an additional sigmoid function.
    """
    def __init__(self, out_size):
        super(DenseNet121, self).__init__()
        self.densenet121 = torchvision.models.densenet121(pretrained=True)
        num_ftrs = self.densenet121.classifier.in_features
        self.densenet121.classifier = nn.Sequential(
            nn.Linear(num_ftrs, out_size),
            nn.Sigmoid()
        )
    def forward(self, x):
        x = self.densenet121(x)
        return x

I load and initialize it with:

# initialize and load the model
model = DenseNet121(nnClassCount).cuda()
model = torch.nn.DataParallel(model).cuda()
modeldict = torch.load("model_ones_3epoch_densenet.tar")
model.load_state_dict(modeldict['state_dict'])

DenseNet doesn't seem to split its layers up into separate sub-modules, so model = nn.Sequential(*list(modelRes.children())[:-1]) doesn't work properly.
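For context, a minimal inspection sketch (a plain torchvision DenseNet121, no checkpoint loaded): it shows that the network only exposes two top-level children, features and classifier, and that the relu/pooling/flatten between them happen inside forward(), which is roughly why rebuilding it with nn.Sequential(*children[:-1]) misbehaves:

import torchvision

net = torchvision.models.densenet121(pretrained=False)

# Only two top-level children: "features" (a Sequential of dense blocks)
# and "classifier" (a Linear); the pooling/flatten step lives in forward().
for name, child in net.named_children():
    print(name, type(child).__name__)
# features Sequential
# classifier Linear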

model.classifier = nn.Linear(1024, 2) seems to work on a stock DenseNet, but with the modified classifier (the added Sigmoid), it ends up only adding an additional classifier layer without replacing the original one.

I tried

model.classifier = nn.Sequential(
    nn.Linear(1024, dset_classes_number), 
    nn.Sigmoid()
)

but that just gets added the same way instead of replacing the classifier. The problem:

...
      )
      (classifier): Sequential(
        (0): Linear(in_features=1024, out_features=14, bias=True)
        (1): Sigmoid()
      )
    )
  )
  (classifier): Sequential(
    (0): Linear(in_features=1024, out_features=2, bias=True)
    (1): Sigmoid()
  )
)

1 Answer:

Answer 0 (score: 0)

If you want to replace the classifier inside the densenet121 member of your model, you need to assign

model.densenet121.classifier = nn.Sequential(...)
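For example, a minimal sketch of the full replacement, rebuilding the wrapper class from the question instead of loading the checkpoint (pretrained=False and the hard-coded 1024 and 2 outputs are just for illustration). Since the question also wraps the model in nn.DataParallel, the inner network is reached through model.module:

import torch.nn as nn
import torchvision

class DenseNet121(nn.Module):
    def __init__(self, out_size):
        super(DenseNet121, self).__init__()
        self.densenet121 = torchvision.models.densenet121(pretrained=False)
        num_ftrs = self.densenet121.classifier.in_features
        self.densenet121.classifier = nn.Sequential(
            nn.Linear(num_ftrs, out_size),
            nn.Sigmoid()
        )

    def forward(self, x):
        return self.densenet121(x)

# Build the 14-class model as in the question (checkpoint loading omitted here).
model = nn.DataParallel(DenseNet121(14))

# Replace the inner classifier; DataParallel keeps the wrapped network in .module.
model.module.densenet121.classifier = nn.Sequential(
    nn.Linear(1024, 2),
    nn.Sigmoid()
)

print(model.module.densenet121.classifier)  # now a 2-way head, the old one is gone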