What is the difference between declaring a tensor inside the loop versus outside it in PyTorch?

Asked: 2019-02-19 21:45:03

Tags: python-3.x pytorch tensor

In one piece of GAN training code, I saw the real- and fake-label tensors being declared inside the loop, like this:

import torch
from torch.autograd import Variable

# `cuda`, `epochs`, and `dataloader` are defined earlier in the original script
Tensor = torch.cuda.FloatTensor if cuda else torch.FloatTensor

# ----------
#  Training
# ----------

for epoch in range(epochs):
    for i, (imgs, _) in enumerate(dataloader):

        # Adversarial ground truths
        valid = Variable(Tensor(imgs.size(0), 1).fill_(1.0), requires_grad=False)
        fake = Variable(Tensor(imgs.size(0), 1).fill_(0.0), requires_grad=False)
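
(Side note: `Variable` has been deprecated since PyTorch 0.4, so a minimal sketch of those two lines with plain tensors, assuming a `device` derived from the same `cuda` flag, would be:)

# Sketch: plain-tensor equivalent of the Variable lines above (PyTorch >= 0.4);
# factory functions like torch.ones default to requires_grad=False.
device = torch.device("cuda" if cuda else "cpu")
valid = torch.ones(imgs.size(0), 1, device=device)
fake = torch.zeros(imgs.size(0), 1, device=device)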

But for convenience, I declared them outside the loop:

import numpy as np

real_labels = torch.ones((50, 1), dtype=torch.float, requires_grad=False)
fake_labels = torch.zeros((50, 1), dtype=torch.float, requires_grad=False)

real_labels = real_labels.to(device)
fake_labels = fake_labels.to(device)

for e in range(epochs):

    for i, (img, _) in enumerate(train_loader):

        img = img.to(device)

        # latent noise: a batch of 50 standard-normal vectors of dimension 100
        z = torch.tensor(np.random.normal(0, 1, (50, 100)), dtype=torch.float, device=device)

        gen_imgs = gen(z)

Otherwise the rest of the code is almost the same. Will this difference affect my training loss in PyTorch? I ask because my generated images are not coming out.
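
The one structural difference I can see is that the in-loop version sizes the labels to `imgs.size(0)`, whereas mine fixes them at 50; if `train_loader` does not drop its last, possibly smaller batch, the shapes will stop matching. A minimal sketch of sizing everything to the actual batch inside the loop (assuming the same `gen`, `device`, and `train_loader` as above):

# Sketch: derive the batch size inside the loop so the labels and the
# noise always match the images, even on a short final batch.
for e in range(epochs):
    for i, (img, _) in enumerate(train_loader):
        img = img.to(device)
        batch = img.size(0)  # may be smaller than 50 on the last batch

        real_labels = torch.ones(batch, 1, device=device)
        fake_labels = torch.zeros(batch, 1, device=device)

        z = torch.randn(batch, 100, device=device)  # standard-normal noise
        gen_imgs = gen(z)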

0 Answers

There are no answers yet.