PyTorch: subtracting variables of different dimensions

Time: 2018-08-06 18:57:59

Tags: python reshape pytorch subtraction cross-entropy

conf_loss = cross_entropy_loss(conf_preds.view(-1,num_classes),conf_targets.view(-1))

The shapes of x and y are

X: torch.Size([69856, 40]), Y: torch.Size([69856])

respectively. The author's docstring gives the sizes as x: [N,D] and y: [N,], but my y size is [N]. I need to compute the difference, but I run out of memory when I do. Can anyone help with the dimensions? After taking the difference the result should end up sized [N,]. The line I need to evaluate is print(log_sum_exp - x.gather(1, y.view(-1,1))).

import torch

def cross_entropy_loss(x, y):
    '''Cross entropy loss w/o averaging across all samples.
    Args:
      x: (tensor) sized [N,D].
      y: (tensor) sized [N,].
    Return:
      (tensor) cross entropy loss, sized [N,].
    '''
    print("X:", x.shape)
    print("Y:", y.shape)
    xmax = x.data.max()

    # numerically stable log-sum-exp over the class dimension, sized [N]
    log_sum_exp = torch.log(torch.sum(torch.exp(x - xmax), 1)) + xmax
    print(log_sum_exp.shape)                   # torch.Size([69856])
    print(x.gather(1, y.view(-1, 1)).shape)    # torch.Size([69856, 1])
    #print(log_sum_exp - x.gather(1, y.view(-1,1)))

    #return log_sum_exp - x.gather(1, y.view(-1,1))
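For reference, here is a minimal standalone sketch of the shape behaviour in that last subtraction. The small sizes (N=5, D=4), the random data, and the squeeze(1) variant at the end are illustrative assumptions, not part of the original code, which uses conf_preds/conf_targets of size [69856, 40] and [69856]:

import torch

N, D = 5, 4                        # small stand-ins for the real N=69856, D=40
x = torch.randn(N, D)              # predictions, sized [N, D]
y = torch.randint(0, D, (N,))      # class indices, sized [N]

xmax = x.max()
log_sum_exp = torch.log(torch.sum(torch.exp(x - xmax), 1)) + xmax   # sized [N]
gathered = x.gather(1, y.view(-1, 1))                               # sized [N, 1]

# A [N] tensor minus a [N, 1] tensor broadcasts to [N, N]; at N = 69856 that is
# roughly 69856 x 69856 floats, which is what exhausts memory.
print((log_sum_exp - gathered).shape)             # torch.Size([5, 5])

# One assumed fix: squeeze the gathered column back to [N] so the subtraction
# stays element-wise and the result matches the docstring's [N,].
print((log_sum_exp - gathered.squeeze(1)).shape)  # torch.Size([5])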

0 Answers:

There are no answers yet.