Lasagne - error

时间:2015-11-14 10:09:41

标签: python theano lasagne

I'm trying to learn neural networks / Lasagne using the code from Robert Layton's Learning Data Mining with Python. I think I'm following the code to the letter, but I get the error message below. Any hint or intuition about what I'm doing wrong would be much appreciated:

Traceback (most recent call last):

  File "<ipython-input-78-3ff2950373de>", line 3, in <module>
    updates=lasagne.updates.sgd(loss,all_params,learning_rate=0.01)

  File "C:\Users\WouterD\Anaconda\lib\site-packages\lasagne\updates.py", line 134, in sgd
    grads = get_or_compute_grads(loss_or_grads, params)

  File "C:\Users\WouterD\Anaconda\lib\site-packages\lasagne\updates.py", line 110, in get_or_compute_grads
    return theano.grad(loss_or_grads, params)

  File "C:\Users\WouterD\Anaconda\lib\site-packages\theano-0.7.0-py2.7.egg\theano\gradient.py", line 551, in grad
    handle_disconnected(elem)

  File "C:\Users\WouterD\Anaconda\lib\site-packages\theano-0.7.0-py2.7.egg\theano\gradient.py", line 538, in handle_disconnected
    raise DisconnectedInputError(message)

DisconnectedInputError: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: W
Backtrace when the node is created:
  File "C:\Users\WouterD\Anaconda\lib\site-packages\theano-0.7.0-py2.7.egg\theano\compile\sharedvalue.py", line 248, in shared
    utils.add_tag_trace(var)

Here is the code:

import numpy as np
from sklearn.datasets import load_iris
iris=load_iris()
X=iris.data.astype(np.float32)
y_true=iris.target.astype(np.int32)

from sklearn.cross_validation import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X,y_true,random_state=14)

import lasagne
input_layer=lasagne.layers.InputLayer(shape=(10,X.shape[1]))

hidden_layer=lasagne.layers.DenseLayer(input_layer,num_units=12,nonlinearity=lasagne.nonlinearities.sigmoid)

output_layer=lasagne.layers.DenseLayer(hidden_layer,num_units=3,nonlinearity=lasagne.nonlinearities.softmax)

import theano.tensor as T
net_input=T.matrix('net_input')
net_output=output_layer.get_output_for(net_input)
true_output=T.ivector("true_output")

loss=T.mean(T.nnet.categorical_crossentropy(net_output,true_output))
all_params=lasagne.layers.get_all_params(output_layer)
updates=lasagne.updates.sgd(loss,all_params,learning_rate=0.01)

2 Answers:

Answer 0 (score: 2)

The problem is that you are not computing the loss with respect to the network's actual input variable. net_input=T.matrix('net_input') is a symbolic input you created yourself, but Lasagne already created one for you when you built the InputLayer. You also don't need to get the output with respect to a specific input; just get the output of the network built on the input layer.

So, replace the two lines

net_input=T.matrix('net_input')
net_output=output_layer.get_output_for(net_input)

with the single line

net_output=lasagne.layers.get_output(output_layer)

To anticipate the next problem you'll run into: the input variable that Lasagne created for you is available as input_layer.input_var, so you can compile your training function like this:

import theano
f = theano.function([input_layer.input_var, true_output], outputs=loss, updates=updates)
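
Putting this together, a minimal end-to-end sketch of the corrected pipeline might look like the following (the mini-batch loop and the epoch count are illustrative additions, not taken from the book):

import numpy as np
import theano
import theano.tensor as T
import lasagne
from sklearn.datasets import load_iris
from sklearn.cross_validation import train_test_split

iris = load_iris()
X = iris.data.astype(np.float32)
y_true = iris.target.astype(np.int32)
X_train, X_test, y_train, y_test = train_test_split(X, y_true, random_state=14)

# Lasagne creates the symbolic input variable for the InputLayer itself
input_layer = lasagne.layers.InputLayer(shape=(10, X.shape[1]))
hidden_layer = lasagne.layers.DenseLayer(input_layer, num_units=12,
                                         nonlinearity=lasagne.nonlinearities.sigmoid)
output_layer = lasagne.layers.DenseLayer(hidden_layer, num_units=3,
                                         nonlinearity=lasagne.nonlinearities.softmax)

# Output expression in terms of input_layer.input_var, so the loss is
# connected to the layer parameters
net_output = lasagne.layers.get_output(output_layer)
true_output = T.ivector('true_output')
loss = T.mean(T.nnet.categorical_crossentropy(net_output, true_output))

all_params = lasagne.layers.get_all_params(output_layer)
updates = lasagne.updates.sgd(loss, all_params, learning_rate=0.01)

train = theano.function([input_layer.input_var, true_output],
                        outputs=loss, updates=updates)

# Feed mini-batches of 10 rows to match the InputLayer shape
for epoch in range(100):
    for i in range(0, len(X_train) - 9, 10):
        train(X_train[i:i + 10], y_train[i:i + 10])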

Answer 1 (score: -1)

input_layer=lasagne.layers.InputLayer(shape=(10,X.shape[1]),input_var=input)

where input is a tensor that you have defined beforehand.
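
In other words, something along these lines (a sketch reusing the names from the question; 4 stands in for X.shape[1] with the iris data):

import theano.tensor as T
import lasagne

# Define the symbolic input yourself and hand it to the InputLayer
net_input = T.matrix('net_input')
input_layer = lasagne.layers.InputLayer(shape=(10, 4), input_var=net_input)

hidden_layer = lasagne.layers.DenseLayer(input_layer, num_units=12,
                                         nonlinearity=lasagne.nonlinearities.sigmoid)
output_layer = lasagne.layers.DenseLayer(hidden_layer, num_units=3,
                                         nonlinearity=lasagne.nonlinearities.softmax)

# get_output now builds an expression in terms of net_input, so the loss
# is connected to the layer parameters and theano.grad no longer raises
# DisconnectedInputError
net_output = lasagne.layers.get_output(output_layer)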