I am training a UNet-based model for a multiclass segmentation task in the PyTorch framework. The model is optimized with the following loss function:
from typing import List

import torch
from torch import Tensor
from torch.nn.modules.loss import _Loss

# soft_jaccard_score is assumed to be importable from the same codebase
# (e.g. pytorch_toolbelt.losses.functional); a sketch of it is given below.


class MulticlassJaccardLoss(_Loss):
    """Implementation of the Jaccard loss for a multiclass (semantic) image segmentation task."""

    __name__ = 'mc_jaccard_loss'

    def __init__(self, classes: List[int] = None, from_logits=True, weight=None, reduction='elementwise_mean'):
        super(MulticlassJaccardLoss, self).__init__(reduction=reduction)
        self.classes = classes
        self.from_logits = from_logits
        self.weight = weight

    def forward(self, y_pred: Tensor, y_true: Tensor) -> Tensor:
        """
        :param y_pred: NxCxHxW
        :param y_true: NxHxW
        :return: scalar
        """
        if self.from_logits:
            y_pred = y_pred.softmax(dim=1)

        n_classes = y_pred.size(1)
        smooth = 1e-3

        # Restrict the loss to the requested classes (all classes by default).
        if self.classes is None:
            classes = range(n_classes)
        else:
            classes = self.classes
            n_classes = len(classes)

        # Per-class loss accumulator.
        loss = torch.zeros(n_classes, dtype=torch.float, device=y_pred.device)

        if self.weight is None:
            weights = [1] * n_classes
        else:
            weights = self.weight

        for class_index, weight in zip(classes, weights):
            jaccard_target = (y_true == class_index).float()
            jaccard_output = y_pred[:, class_index, ...]

            num_preds = jaccard_target.long().sum()

            if num_preds == 0:
                # Class absent from the batch: its loss term stays zero.
                loss[class_index - 1] = 0  # custom
            else:
                iou = soft_jaccard_score(jaccard_output, jaccard_target, from_logits=False, smooth=smooth)
                loss[class_index - 1] = (1.0 - iou) * weight  # custom

        if self.reduction == 'elementwise_mean':
            return loss.mean()

        if self.reduction == 'sum':
            return loss.sum()

        return loss
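The soft_jaccard_score helper is not included in the question. Assuming it follows the usual soft-IoU formulation, a minimal sketch consistent with the call signature above (not necessarily the asker's exact implementation) would be:

def soft_jaccard_score(output: Tensor, target: Tensor, from_logits: bool = False, smooth: float = 1e-3) -> Tensor:
    # Soft intersection-over-union between a probability map and a
    # binary target mask of the same shape.
    if from_logits:
        output = output.sigmoid()
    intersection = torch.sum(output * target)
    cardinality = torch.sum(output) + torch.sum(target)
    return (intersection + smooth) / (cardinality - intersection + smooth)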
I compute the loss for only two classes (class 1 and class 2, not the background class):
MulticlassJaccardLoss(weight=[0.5,10], classes=[1,2], from_logits=False)
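For context, the criterion plugs into a training step roughly like this (a minimal sketch; model, optimizer, images, and masks are hypothetical names, since the actual training loop is not part of the question):

# criterion is the MulticlassJaccardLoss instance constructed above
# images: NxCxHxW float tensor, masks: NxHxW long tensor of class indices
logits = model(images)            # NxCxHxW raw scores
probs = logits.softmax(dim=1)     # softmax applied here because from_logits=False
loss = criterion(probs, masks)    # scalar, since reduction='elementwise_mean'
optimizer.zero_grad()
loss.backward()
optimizer.step()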
When I train the model, it trains for the first few iterations and then fails with the following error:
element 0 of tensors does not require grad and does not have a grad_fn
What is the bug in the code?
Thanks!
Answer 0 (score: 1)
Try setting:
torch.zeros(..., requires_grad=True)
I believe the default for torch.zeros is requires_grad=False, so this might help.
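For reference, this matches what happens when, for some batch, every selected class hits the num_preds == 0 branch: loss is then still the freshly created torch.zeros tensor, which does not require grad and has no grad_fn to backpropagate through. A minimal reproduction (assuming nothing beyond plain PyTorch):

import torch

loss = torch.zeros(2, dtype=torch.float)  # requires_grad defaults to False
print(loss.requires_grad, loss.grad_fn)   # False None
loss.mean().backward()                    # RuntimeError: element 0 of tensors does not
                                          # require grad and does not have a grad_fn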