多层感知器不工作

时间:2017-09-18 14:45:36

标签: python neural-network perceptron

我正在尝试使用简单的numpy实现多层感知器。但是,我正在遇到一个障碍。我没有像过去那样正确地实现它,我总是使用库来实现这个目的。我非常感谢在调试代码时提供一些帮助。它不到100行代码,希望不要占用太多时间。谢谢!

我的感知器的细节如下:

  1. 输入= 2
  2. 输出= 1
  3. 隐藏层数= 5
  4. 损失=平方错误
  5. 以下是我的代码:

    (我在所有看似必要的地方评论过)

    import numpy as np
    import matplotlib.pyplot as plt
    
    #Sampling 100 random values uniformly distributed b/w 0 and 5.
    x=np.random.uniform(low=0, high=5, size=(100,))
    y=np.multiply(x,x)
    #Storing the random values and their squares in x and y
    x=np.reshape(x,(-1,1))
    y=np.reshape(y,(-1,1))
    # plt.plot(x,y, 'ro')
    # plt.show()
    
    #Network Initialisation
    hSize=5
    inputSize=1
    outputSize=1
    Wxh=np.random.rand(hSize, inputSize+1)
    Woh=np.random.rand(outputSize, hSize+1)
    
    
    #+++++++++++++Back-propagation++++++++++++++
    iterations=100
    WohGrad=np.zeros(Woh.shape)
    WxhGrad=np.zeros(Wxh.shape)
    for i in range(0, iterations):
        #+++++++++++++Forward Pass++++++++++++++
        #Input Layer
        z1=x[i]
        a1=z1
        h1=np.append([1], a1)
        #Hidden Layer-1
        z2=np.dot(Wxh, h1)
        a2=1/(1+np.exp(-z2))
        h2=np.append([1], a2)
        #Output Layer
        z3=np.dot(Woh, h2)
        a3=z3
    
    
        #+++++++++++++Backward Pass++++++++++++++
        #Squared Error
        pred=a3
        expected=y[i]
        loss=np.square(pred-expected)/2
    
        #Delta Values
        delta_3=(pred-expected)
        delta_2=np.multiply(np.dot(np.transpose(Woh), delta_3)[1:], 1/(1+np.exp(-z2) ))
    
        #Parameter Gradients and Update
        WohGrad=WohGrad+np.dot(delta_3,(h2.reshape(1,-1)))
        WxhGrad=WxhGrad+np.dot(delta_2.reshape(hSize,-1),(h1.reshape(1,-1)))
    
    #Parameter Update
    learningRate=0.01
    L2_regularizer=0.01
    WohGrad=WohGrad/iterations+L2_regularizer*Woh
    WxhGrad=WxhGrad/iterations+L2_regularizer*Wxh
    Wxh=Wxh-learningRate*WxhGrad
    Woh=Woh-learningRate*WohGrad
    
    
    #++++++++Testing++++++++++
    #Forward Pass
    #Input Layer
    z1=np.array([2.5])
    a1=z1
    h1=np.append([1], a1)
    
    
    #Hidden Layer-1
    z2=np.dot(Wxh, h1)
    a2=1/(1+np.exp(-z2))
    h2=np.append([1], a2)
    #Output Layer
    z3=np.dot(Woh, h2)
    a3=z3
    print(a3)
    

0 个答案:

没有答案