Using torch.Tensor() with the requires_grad argument

Time: 2018-11-08 05:15:54

Tags: python pytorch tensor

I can't use torch.Tensor() with the requires_grad argument (torch version: 0.4.1).

Without requires_grad:

x = torch.Tensor([[.5, .3, 2.1]])
print(x)
> tensor([[0.5000, 0.3000, 2.1000]])

With requires_grad=True or requires_grad=False:

x = torch.Tensor([[.5, .3, 2.1]], requires_grad=False)
print(x)
Traceback (most recent call last):
  File "D:/_P/dev/ai/pytorch/notes/tensor01.py", line 4, in <module>
    x = torch.Tensor([[.5, .3, 2.1]], requires_grad=False)
TypeError: new() received an invalid combination of arguments - got (list, requires_grad=bool), but expected one of:
 * (torch.device device)
 * (torch.Storage storage)
 * (Tensor other)
 * (tuple of ints size, torch.device device)
      didn't match because some of the keywords were incorrect: requires_grad
 * (object data, torch.device device)
      didn't match because some of the keywords were incorrect: requires_grad

1 answer:

Answer 0: (score: 1)

You are creating the tensor x with the torch.Tensor class constructor, which does not accept the requires_grad flag. Instead, you want to use the torch.tensor() factory function (lowercase "t"):

x = torch.tensor([[.5, .3, 2.1]], requires_grad=False)
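
For completeness, here is a minimal sketch (not part of the original answer) showing that the lowercase torch.tensor() factory does accept requires_grad, and that gradients then flow through the usual autograd machinery:

import torch

# Build a leaf tensor that tracks gradients (lowercase torch.tensor factory).
x = torch.tensor([[.5, .3, 2.1]], requires_grad=True)

# Any differentiable computation on x builds the autograd graph.
y = (x ** 2).sum()
y.backward()

print(x.requires_grad)  # > True
print(x.grad)           # > tensor([[1.0000, 0.6000, 4.2000]])

An existing tensor can also be switched in place with x.requires_grad_(True), which avoids rebuilding it.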

Edit: added a link to the documentation: torch.Tensor