NeuPy: input shape problem

Asked: 2017-11-18 18:58:24

Tags: python machine-learning theano neupy

I want to build a neural network with neupy, so I built the following architecture:

 network = layers.join(
                    layers.Input(10),

                    layers.Linear(500),
                    layers.Relu(),

                    layers.Linear(300),
                    layers.Relu(),

                    layers.Linear(10),
                    layers.Softmax(),
                )

My data is shaped as follows:

x_train.shape = (32589,10)
y_train.shape = (32589,1)

When I try to train this network with

model.train(x_train, y_train)

I get the following error:

ValueError: Input dimension mis-match. (input[0].shape[1] = 10, input[1].shape[1] = 1)
Apply node that caused the error: Elemwise{sub,no_inplace}(SoftmaxWithBias.0, algo:network/var:network-output)
Toposort index: 26
Inputs types: [TensorType(float64, matrix), TensorType(float64, matrix)]
Inputs shapes: [(32589, 10), (32589, 1)]
Inputs strides: [(80, 8), (8, 8)]
Inputs values: ['not shown', 'not shown']
Outputs clients: [[Elemwise{Composite{((i0 * i1) / i2)}}(TensorConstant{(1, 1) of 2.0}, Elemwise{sub,no_inplace}.0, Elemwise{mul,no_inplace}.0), Elemwise{Sqr}[(0, 0)](Elemwise{sub,no_inplace}.0)]]

How can I edit the network so that it can map this kind of data?

Thank you very much!

1 Answer:

Answer 0 (score: 1)

Your architecture has 10 outputs instead of 1. I assume your y_train values are 0-1 class identifiers. If that is the case, then you need to change the structure to:

network = layers.join(
   layers.Input(10),

   layers.Linear(500),
   layers.Relu(),

   layers.Linear(300),
   layers.Relu(),

   layers.Linear(1),  # Single output
   layers.Sigmoid(),  # Sigmoid works better for 2-class classification
)

You can make it even simpler:

network = layers.join(
   layers.Input(10),
   layers.Relu(500),
   layers.Relu(300),
   layers.Sigmoid(1),
)

The reason this works is that layers.Linear(10) > layers.Relu() is the same as layers.Relu(10). You can read more about it in the official documentation: http://neupy.com/docs/layers/basics.html#mutlilayer-perceptron-mlp
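For completeness, here is a minimal end-to-end sketch of how the simplified network could be trained, assuming a gradient-based neupy optimizer such as algorithms.Momentum; the optimizer choice, step size, epoch count, and the randomly generated arrays are illustrative assumptions, not part of the original question.

```python
import numpy as np
from neupy import algorithms, layers

# Illustrative data with the shapes from the question:
# 10 input features and a single 0/1 class label per sample.
x_train = np.random.random((32589, 10))
y_train = np.random.randint(0, 2, size=(32589, 1))

# Same architecture as above, written with the inline ">" operator.
network = layers.Input(10) > layers.Relu(500) > layers.Relu(300) > layers.Sigmoid(1)

# Assumed optimizer: any gradient-based neupy algorithm should work here.
optimizer = algorithms.Momentum(network, step=0.1, verbose=True)
optimizer.train(x_train, y_train, epochs=10)
```

With a single sigmoid output, the network's predictions have shape (n_samples, 1), which matches y_train and avoids the dimension mismatch from the error above.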