Gradient checking problem in a deep neural network

Date: 2019-02-20 15:18:34

Tags: neural-network deep-learning gradient-descent

I am currently writing code for a deep neural network. I have implemented forward and backward propagation, and to verify that my backpropagation is correct I implemented gradient checking. Even though the weights and biases are initialized randomly, the difference between the numerically approximated gradients and the gradients obtained from backpropagation is always large, and always lands around the same value: roughly 0.6. Any ideas what could be wrong?
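Concretely, for each parameter I form the two-sided finite-difference approximation and then compare the full approximated gradient against the backprop gradient using the usual relative difference (this is exactly what the code below computes):

$$\text{gradapprox}_i = \frac{J(\theta + \varepsilon e_i) - J(\theta - \varepsilon e_i)}{2\varepsilon}, \qquad \text{difference} = \frac{\lVert \text{grad} - \text{gradapprox} \rVert_2}{\lVert \text{grad} \rVert_2 + \lVert \text{gradapprox} \rVert_2}$$

with $\varepsilon = 10^{-7}$, and the check flags an error whenever the difference exceeds $2 \times 10^{-7}$.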

Code:

import numpy as np

def grad_check(gradients, parameters, X, Y, activation_functions, layers_dims, pRelu, epsilon=1e-7):
    # Flatten the parameter and gradient dictionaries into column vectors
    theta, positions = dic_to_vector(parameters)
    grads_vector, _ = dic_to_vector(gradients, False)
    nparams = len(theta)
    n_att = X.shape[0]
    gradapprox = np.zeros((nparams, 1))

    for i in range(nparams):
        # Perturb the i-th parameter by +epsilon and -epsilon
        thetap = np.array(theta)
        thetap[i] = thetap[i] + epsilon
        thetam = np.array(theta)
        thetam[i] = thetam[i] - epsilon

        # Forward pass and cost with each perturbed parameter vector
        ALp, _ = forward_prop(X, vector_to_dic(thetap, positions, layers_dims, n_att), activation_functions, pRelu)
        ALm, _ = forward_prop(X, vector_to_dic(thetam, positions, layers_dims, n_att), activation_functions, pRelu)

        Jp = compute_cost(ALp, Y)
        Jm = compute_cost(ALm, Y)

        # Two-sided finite-difference approximation of the i-th partial derivative
        gradapprox[i] = (Jp - Jm) / (2 * epsilon)

    # Relative difference between the backprop gradients and the numerical approximation
    numerator = np.linalg.norm(grads_vector - gradapprox)
    denominator = np.linalg.norm(grads_vector) + np.linalg.norm(gradapprox)
    difference = numerator / denominator

    if difference > 2e-7:
        print("\033[93m" + "There is a mistake in the backward propagation! difference = " + str(difference) + "\033[0m")
    else:
        print("\033[92m" + "Your backward propagation works perfectly fine! difference = " + str(difference) + "\033[0m")

    return difference
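For reference, here is a minimal standalone sketch (a toy quadratic cost, not part of my actual network code) that exercises the same two-sided difference and relative-error logic against an analytically known gradient. A correct check should report a value far below the 2e-7 threshold on this toy problem, so if this already fails, the issue is in the checking code rather than in backpropagation:

import numpy as np

def toy_cost(theta):
    # J(theta) = 0.5 * ||theta||^2, whose exact gradient is theta
    return 0.5 * np.sum(theta ** 2)

def toy_grad(theta):
    return theta  # analytical gradient of the toy cost

def toy_grad_check(theta, epsilon=1e-7):
    gradapprox = np.zeros_like(theta)
    for i in range(theta.size):
        thetap = np.array(theta)
        thetap[i] += epsilon
        thetam = np.array(theta)
        thetam[i] -= epsilon
        # Two-sided finite-difference approximation of the i-th partial derivative
        gradapprox[i] = (toy_cost(thetap) - toy_cost(thetam)) / (2 * epsilon)

    grad = toy_grad(theta)
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator

print(toy_grad_check(np.random.randn(10)))  # expected: well below 2e-7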
