PyTorch Softmax Dimensions Error

Date: 2018-01-03 02:46:17

Tags: python machine-learning neural-network pytorch

I'm trying to write a simple NN module with two layers: the first with ReLU activation, and a softmax output over 3 classes (one-hot encoded). Something seems to be wrong with the way I'm using the softmax function, but I'm not sure what.

X is 178x13 and Y is 178x3.

The dataset I'm using is fairly simple and can be found here.

I keep getting the error:

RuntimeError: dimension out of range (expected to be in range of [-2, 1], but got 3)

import pandas as pd
import numpy as np
import torch
from torch.autograd import Variable
from sklearn.preprocessing import LabelBinarizer

# Read in dataset, specifying that this set didn't come with column headers
x = pd.read_csv('Datasets/wine.data', header=None)

# Rename columns
x.columns = ['Class', 'A1', 'A2', 'A3', 'A4', 'A5', 'A6', 'A7', 'A8', 'A9', 'A10', 'A11', 'A12', 'A13']

y = x[['Class']].values

#turn class labels into one-hot encoding
one_hot = LabelBinarizer()
y = Variable(torch.from_numpy(one_hot.fit_transform(y)), )

x = Variable(torch.from_numpy(x.iloc[:, 1:14].values).float())


N, D_in, H, D_out = y.shape[0], x.shape[1], 20, 3

# Implement neural net with nn module

model = torch.nn.Sequential(
    torch.nn.Linear(D_in, H),
    torch.nn.ReLU(),
    torch.nn.Linear(H, D_out),
    torch.nn.LogSoftmax(dim=3)
)

loss_fn = torch.nn.NLLLoss

learning_rate = 1e-4
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

for t in range(500):
    y_pred = model(x)

    loss = loss_fn(y_pred, y)
    print("Iteration: %d | Loss: %.3f" % (t, loss))

    optimizer.zero_grad()

    loss.backward()

    optimizer.step()

2 Answers

Answer 0 (score: 1)

It looks to me like you have misunderstood the dim argument of LogSoftmax. From the documentation:

dim (int) – A dimension along which Softmax will be computed (so every slice along dim will sum to 1).

Now, after you pass your input through the two linear layers, the tensor you obtain (the one you apply LogSoftmax to) has dimensions 178 x 3. Clearly, dim = 3 is not available, since your tensor has only two dimensions. Instead, try dim=1 so that each row (one per sample) sums to 1, as in the sketch below.
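For reference, here is a minimal standalone sketch of the corrected model definition (same architecture as in the question; only the dim argument changes, and a dummy input is used in place of the wine data):

import torch

D_in, H, D_out = 13, 20, 3

# LogSoftmax normalizes along dim=1 (the class dimension of the
# 178 x 3 output), so each row is a valid log-probability vector.
model = torch.nn.Sequential(
    torch.nn.Linear(D_in, H),
    torch.nn.ReLU(),
    torch.nn.Linear(H, D_out),
    torch.nn.LogSoftmax(dim=1)
)

x = torch.randn(178, D_in)    # dummy input with the question's shape
print(model(x).exp().sum(1))  # every entry is 1.0: each row sums to one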

Answer 1 (score: 1)

It was also a problem because, per the documentation for NLLLoss:

The target that this loss expects is a class index (0 to N-1, where N = number of classes)

I had been trying to give it the one-hot encoded vectors. I solved my problem by doing:

loss = loss_fn(y_pred, torch.max(y, 1)[1])

where torch.max returns both the maximum values and their respective indices; indexing with [1] keeps just the indices, which are the class labels NLLLoss expects.
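As a quick sanity check (a small standalone sketch, not from the original post), you can see how this converts one-hot rows back into class indices:

import torch

# Three one-hot encoded labels for classes 0, 2, and 1.
y = torch.tensor([[1, 0, 0],
                  [0, 0, 1],
                  [0, 1, 0]])

# torch.max along dim 1 returns a (values, indices) pair;
# the indices are exactly the class labels NLLLoss expects.
print(torch.max(y, 1)[1])  # tensor([0, 2, 1])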