Coursera ML Exercise 4 (Week 5): Neural network cost function in Python

Time: 2018-07-04 14:52:06

Tags: python machine-learning

I am working through Exercise 4 and I cannot figure out the cost function. I don't want to cheat, so could someone point me in the right direction?

def nnCostFunction(nn_params,input_layer_size,hidden_layer_size,num_labels,X, y, lambda_=0.0):
    Theta1 = np.reshape(nn_params[:hidden_layer_size * (input_layer_size + 1)],
                    (hidden_layer_size, (input_layer_size + 1)))

    Theta2 = np.reshape(nn_params[(hidden_layer_size * (input_layer_size + 1)):],
                    (num_labels, (hidden_layer_size + 1)))

    # Setup some useful variables
    m = y.size

    # You need to return the following variables correctly 
    J = 0
    Theta1_grad = np.zeros(Theta1.shape)
    Theta2_grad = np.zeros(Theta2.shape)

    # ====================== YOUR CODE HERE ======================
    x = utils.sigmoid(np.dot(X, Theta1.T))              # hidden layer activations, 5000 x 25
    x_C = np.concatenate([np.ones((m, 1)), x], axis=1)  # add bias column to hidden layer
    z = utils.sigmoid(np.dot(x_C, Theta2.T))            # output layer activations, 5000 x 10
    J = (1 / m) * np.sum(-np.dot(y, np.log(z)) - np.dot((1 - y), np.log(1 - z)))
    # ================================================================
    # Unroll gradients
    # grad = np.concatenate([Theta1_grad.ravel(order=order), Theta2_grad.ravel(order=order)])
    grad = np.concatenate([Theta1_grad.ravel(), Theta2_grad.ravel()])

    return J, grad

lambda_ = 0
J, _ = nnCostFunction(nn_params, input_layer_size, hidden_layer_size,
                   num_labels, X, y, lambda_)
print('Cost at parameters (loaded from ex4weights): %.6f ' % J)
print('The cost should be about                   : 0.287629.')

>> Cost at parameters (loaded from ex4weights): 949.011852  
The cost should be about                   : 0.287629. 

In another cell, I printed J without the final sum, and it was:

    array([ 32.94277417,  31.60660549, 121.58989642, 110.33099785,
           111.01961993, 105.33746192, 124.60468929, 117.79628872,
           102.04080206,  91.74271593])
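(A minimal shape check, not part of the exercise: because y here still has shape (5000,) while np.log(z) has shape (5000, 10), np.dot(y, np.log(z)) contracts over the 5000 examples and returns one value per output class, which is exactly the 10-element array above. The demo names below are illustrative.)

    import numpy as np

    y_demo = np.random.randint(0, 10, size=5000)   # labels, shape (5000,)
    log_z_demo = np.log(np.random.rand(5000, 10))  # shape (5000, 10)
    print(np.dot(y_demo, log_z_demo).shape)        # -> (10,)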

So why is my cost wrong? Can someone guide me?

Here is the main source code for more information.

1 Answer:

Answer 0 (score: 0):

Study section 1.2, Model representation, carefully.

You have forgotten to add the bias unit to the first-layer units. The entire backpropagation part is also missing.
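For concreteness, here is a minimal sketch of the cost part only, assuming utils.sigmoid from the exercise's helper module and that y holds integer labels 0..num_labels-1 as in the Python port of the course. The helper name nn_cost_only and the one-hot construction are illustrative, not the official solution:

    import numpy as np
    # utils.sigmoid is assumed to come from the exercise's helper module.

    def nn_cost_only(nn_params, input_layer_size, hidden_layer_size,
                     num_labels, X, y, lambda_=0.0):
        # Recover the weight matrices from the flat parameter vector.
        Theta1 = np.reshape(nn_params[:hidden_layer_size * (input_layer_size + 1)],
                            (hidden_layer_size, input_layer_size + 1))
        Theta2 = np.reshape(nn_params[hidden_layer_size * (input_layer_size + 1):],
                            (num_labels, hidden_layer_size + 1))
        m = y.size

        # Forward pass -- note the bias column is added to BOTH layers.
        a1 = np.concatenate([np.ones((m, 1)), X], axis=1)   # m x 401
        a2 = utils.sigmoid(np.dot(a1, Theta1.T))             # m x 25
        a2 = np.concatenate([np.ones((m, 1)), a2], axis=1)   # m x 26
        h = utils.sigmoid(np.dot(a2, Theta2.T))              # m x 10

        # One-hot encode the labels so they match the shape of h.
        y_onehot = np.eye(num_labels)[y]                     # m x 10

        # Element-wise (not matrix) products inside the double sum.
        J = (1 / m) * np.sum(-y_onehot * np.log(h)
                             - (1 - y_onehot) * np.log(1 - h))

        # Regularization skips the bias columns of both weight matrices.
        J += (lambda_ / (2 * m)) * (np.sum(Theta1[:, 1:] ** 2)
                                    + np.sum(Theta2[:, 1:] ** 2))
        return J

The three differences from the code in the question are the bias column concatenated to X before the first layer, the one-hot encoding of y so that the products inside the sum are element-wise rather than np.dot, and the regularization term. Theta1_grad and Theta2_grad still have to be computed by backpropagation, which is a separate part of the exercise.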