Trying to implement backpropagation

Asked: 2017-12-22 23:17:16

Tags: python machine-learning neural-network backpropagation gradient-descent

I'm new to machine learning, and I'm trying to implement gradient descent with backpropagation. I've written the code below, and I can't figure out why it isn't working...

import numpy as np


def cost(params, X, y, alpha, lmda):
    # first unroll the params
    w1 = params[0]
    b1 = 1.0   # bias nodes are always 1
    w1b = params[1]

    w2 = params[2]
    b2 = 1.0   # bias nodes are always 1
    w2b = params[3]

    m = np.double(X.shape[0])   # number of training examples

    # feed forward
    a1 = X
    z2 = np.dot(a1, w1) + b1 * w1b
    a2 = g(z2)
    z3 = np.dot(a2, w2) + b2 * w2b
    a3 = g(z3)

    # mean squared error over all outputs and examples
    E = sum(sum((y - a3) ** 2)) / m

    # back propagation
    dE = 2 * (y - a3)

    dz3 = np.multiply(dE, gPrime(z3))
    dw2 = sum(np.dot(a2.T, dz3)) / m
    dw2b = sum(dz3) / m

    dz2 = np.multiply(np.dot(dz3, w2.T), gPrime(z2))
    dw1 = sum(np.dot(a1.T, dz2)) / m
    dw1b = sum(dz2) / m

    # take one gradient step on every parameter
    return E, [w1 - alpha * dw1, w1b - alpha * dw1b,
               w2 - alpha * dw2, w2b - alpha * dw2b]
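The code calls g and gPrime without showing them. A minimal sketch of what they could look like, assuming a sigmoid activation (only the names g and gPrime come from the question; the sigmoid choice is an assumption):

    import numpy as np

    def g(z):
        # sigmoid activation -- an assumed definition, not shown in the question
        return 1.0 / (1.0 + np.exp(-z))

    def gPrime(z):
        # derivative of the sigmoid, written in terms of g itself
        s = g(z)
        return s * (1.0 - s)

Two things worth checking in the gradient step, offered as observations rather than a confirmed diagnosis: with E = sum((y - a3)**2)/m, the derivative with respect to a3 is -2*(y - a3)/m, so dE = 2*(y - a3) has the opposite sign and the update w - alpha*dw moves up the error surface rather than down it; and np.dot(a2.T, dz3) already sums over the training examples, so wrapping it in another sum(...) collapses the weight gradient to the wrong shape.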

Here is my calling function:

import matplotlib.pyplot as plot


def train(p, X, y, alpha, lmda, iteration):
    E = []
    for i in range(iteration):
        print i
        e, p = cost(p, X, y, alpha, lmda)   # one step; updated params go back into p
        E.append(e)
    print 'cost plot'
    plot.plot(E)   # cost per iteration
    plot.show()
    return p
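For completeness, a sketch of how train might be called on toy data; the shapes, hidden-layer size, initialisation, and hyperparameter values below are all illustrative assumptions, not from the question:

    import numpy as np

    np.random.seed(0)
    X = np.random.rand(100, 4)    # 100 examples with 4 features (assumed shape)
    y = np.random.rand(100, 2)    # 2 output units (assumed shape)
    hidden = 5                    # assumed hidden-layer size

    # parameter list unrolled as [w1, w1b, w2, w2b], matching cost()
    p = [0.1 * np.random.randn(4, hidden),   # w1: input -> hidden weights
         0.1 * np.random.randn(hidden),      # w1b: hidden-layer bias weights
         0.1 * np.random.randn(hidden, 2),   # w2: hidden -> output weights
         0.1 * np.random.randn(2)]           # w2b: output-layer bias weights

    p = train(p, X, y, 0.1, 0.0, 1000)       # alpha=0.1, lmda=0.0, 1000 iterations

Note that lmda is accepted by cost but never used, so regularisation has no effect as written.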

This is the curve I get when plotting the cost... [cost plot image]

0 Answers:

No answers