I am trying to manually assign new weights to my PyTorch model. I can assign new weights like this:
import scipy.io as sio
import torch
caffe_params = sio.loadmat('export_conv1_1.mat')
net.conv1_1.weight = torch.nn.Parameter(torch.from_numpy(caffe_params['w']))
net.conv1_1.bias = torch.nn.Parameter(torch.from_numpy(caffe_params['b']))
caffe_params = sio.loadmat('export_conv2_1.mat')
net.conv2_1.weight = torch.nn.Parameter(torch.from_numpy(caffe_params['w']))
net.conv2_1.bias = torch.nn.Parameter(torch.from_numpy(caffe_params['b']))
Since I have many layers, I don't want to assign each one manually by name. Instead, I'd like to loop over a list of layer names and assign them automatically, like this:
varList = ['conv2_1','conv2_2']
for name in varList:
    caffe_params = sio.loadmat(rootDir + 'export_' + name + '.mat')
    setattr(net, name + '.weight', torch.nn.Parameter(torch.from_numpy(caffe_params['w'])))
    setattr(net, name + '.bias', torch.nn.Parameter(torch.from_numpy(caffe_params['b'])))
Unfortunately, this doesn't work. My guess is that setattr can't handle PyTorch weights, or more generally attributes of the form layername.weight, i.e. attributes that sit at depth 2 relative to net.
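The guess about depth is right: setattr never traverses dotted paths; it creates a single attribute whose name literally contains a dot, leaving the nested attribute untouched (and nn.Module goes further, rejecting parameter names containing "." outright). A minimal sketch with hypothetical plain classes, standing in for net and its layers:

```python
class Inner:
    weight = None

class Outer:
    def __init__(self):
        self.conv = Inner()

net = Outer()
setattr(net, 'conv.weight', 42)       # makes an attribute literally named "conv.weight"
print(net.conv.weight)                # None: the nested attribute is untouched
print(getattr(net, 'conv.weight'))    # 42: the oddly named flat attribute

# To reach depth 2, resolve the parent object first, then set on it:
setattr(getattr(net, 'conv'), 'weight', 7)
print(net.conv.weight)                # 7
```

This is why the getattr-the-parent-first pattern in the answer below the question works where the dotted setattr does not.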
Answer 0 (score: 1)
Does the following work?
for name in varList:
    caffe_params = sio.loadmat(rootDir + 'export_' + name + '.mat')
    getattr(net, name).weight.data.copy_(torch.from_numpy(caffe_params['w']))
    getattr(net, name).bias.data.copy_(torch.from_numpy(caffe_params['b']))
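This pattern can be checked end to end with a toy network; the Conv2d layers and random arrays below are stand-ins for the asker's real net and the exported .mat files (which aren't available here). Note that copy_ requires matching shapes, so the arrays mirror each layer's own shapes:

```python
import numpy as np
import torch
import torch.nn as nn

# Toy stand-in for the asker's network, with the same layer names.
net = nn.Sequential()
net.add_module('conv2_1', nn.Conv2d(3, 8, 3))
net.add_module('conv2_2', nn.Conv2d(8, 8, 3))

varList = ['conv2_1', 'conv2_2']
for name in varList:
    layer = getattr(net, name)           # resolve the layer object first
    # Fabricated arrays in place of sio.loadmat(...)['w'] / ['b'].
    w = np.random.randn(*layer.weight.shape).astype(np.float32)
    b = np.random.randn(*layer.bias.shape).astype(np.float32)
    layer.weight.data.copy_(torch.from_numpy(w))   # in-place copy, no rewrapping
    layer.bias.data.copy_(torch.from_numpy(b))
    assert np.allclose(layer.weight.detach().numpy(), w)
```

Unlike reassigning `.weight` to a fresh Parameter, copying into `.data` keeps the existing Parameter object, so anything already holding a reference to it (e.g. an optimizer) still sees the new values.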