'DataParallel' object has no attribute 'init_hidden'

Asked: 2018-05-21 04:26:48

Tags: deep-learning pytorch

What I am trying to do is use DataParallel with my custom RNN class.

It seems I am initializing hidden_0 the wrong way...

import torch as T
import torch.nn as nn
from torch.autograd import Variable

class RNN(nn.Module):
    def __init__(self, input_size, hidden_size, output_size, n_layers=1):
        super(RNN, self).__init__()
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.output_size = output_size
        self.n_layers = n_layers

        self.encoder = nn.Embedding(input_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, n_layers, batch_first=True)
        self.decoder = nn.Linear(hidden_size, output_size)
        # batch_size is a module-level global in my script
        self.init_hidden(batch_size)

    def forward(self, input):
        input = self.encoder(input)
        output, self.hidden = self.gru(input, self.hidden)
        output = self.decoder(output.contiguous().view(-1, self.hidden_size))
        output = output.contiguous().view(batch_size, num_steps, N_CHARACTERS)
        # print(output.size())  # torch.Size([10, 50, 67])
        return output

    def init_hidden(self, batch_size):
        self.hidden = Variable(T.zeros(self.n_layers, batch_size, self.hidden_size).cuda())

I call the network this way:

decoder = T.nn.DataParallel(RNN(N_CHARACTERS, HIDDEN_SIZE, N_CHARACTERS), dim=1).cuda()

Then I start training:

for epoch in range(EPOCH_):
    hidden = decoder.init_hidden()

But I get the error below and I don't understand how to fix it...

'DataParallel' object has no attribute 'init_hidden'

Thanks for your help!

2 Answers:

Answer 0 (score: 6)

When using DataParallel, the original module is stored in the parallel module's module attribute:

for epoch in range(EPOCH_):
    decoder.module.init_hidden(batch_size)
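Note that init_hidden takes batch_size in the question's code, so it is passed above. More generally, attribute access goes through decoder.module, but the forward pass should still go through the DataParallel wrapper so the batch gets split across the GPUs. A minimal sketch of a training step under that convention (criterion, optimizer, inp, and target are hypothetical stand-ins, not from the question):

# Hypothetical training step: attributes via .module, forward via the wrapper
for epoch in range(EPOCH_):
    decoder.module.init_hidden(batch_size)  # custom method lives on the wrapped RNN
    output = decoder(inp)                   # DataParallel splits the batch here
    loss = criterion(output.view(-1, N_CHARACTERS), target.view(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()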

Answer 1 (score: 0)

The way I solved it is:

self.model = model
# If the model is wrapped by DataParallel, its attributes can only be reached
# through model.module, which breaks code compatibility. model_attr_accessor
# is used for attribute access only.
if isinstance(model, DataParallel):
    self.model_attr_accessor = model.module
else:
    self.model_attr_accessor = model

This gives me the advantage that the model is distributed across the GPUs when I run self.model(input) (i.e., when the model is wrapped by DataParallel), while I can still access its attributes simply with self.model_attr_accessor.<<WHATEVER>>. This design also gives me a more modular way to access attributes from several functions without needing if-statements in all of them to check whether the model is wrapped by DataParallel.

Conversely, if you wrote model.module.<<WHATEVER>> and the model was not wrapped by DataParallel, this would raise an error saying that your model has no attribute module.
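To make the pattern concrete, here is a minimal self-contained sketch built around that accessor; the Trainer class and its method names are illustrative, not part of the original answer:

import torch.nn as nn
from torch.nn import DataParallel

class Trainer:
    def __init__(self, model):
        self.model = model
        # Route attribute access through .module only when the model is wrapped.
        if isinstance(model, DataParallel):
            self.model_attr_accessor = model.module
        else:
            self.model_attr_accessor = model

    def reset_state(self, batch_size):
        # Works whether or not the model is wrapped by DataParallel.
        self.model_attr_accessor.init_hidden(batch_size)

    def step(self, inp):
        # The forward pass still goes through the wrapper so the batch is split.
        return self.model(inp)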


However, a more compact implementation is to create a custom DataParallel like this:

class _CustomDataParallel(nn.Module):
    def __init__(self, model):
        super(_CustomDataParallel, self).__init__()
        self.model = nn.DataParallel(model).cuda()
        print(type(self.model))

    def forward(self, *input):
        return self.model(*input)

    def __getattr__(self, name):
        # First look the attribute up on this wrapper itself; if that fails,
        # fall back to the module wrapped by DataParallel.
        try:
            return super().__getattr__(name)
        except AttributeError:
            return getattr(self.model.module, name)
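For context, a hypothetical usage with the RNN from the question (the constants are the asker's; inp is a stand-in for an input batch):

# Hypothetical usage with the question's RNN
decoder = _CustomDataParallel(RNN(N_CHARACTERS, HIDDEN_SIZE, N_CHARACTERS))

decoder.init_hidden(batch_size)  # resolved via __getattr__ on the wrapped RNN
output = decoder(inp)            # forward still runs through nn.DataParallel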