Incorrect Backprop Update in Batchnorm

Date: 2018-05-02 13:44:37

Tags: machine-learning neural-network backpropagation batch-normalization

I have these backprop updates; please tell me what is wrong with the dx part. In the computational graph I used X, sample_mean, and sample_var as the intermediate nodes. Thanks for your help.

import numpy as np

(x, norm, sample_mean, sample_var, gamma, eps) = cache
dbeta = np.sum(dout, axis=0)
dgamma = np.sum(dout * norm, axis=0)
# gradient flowing through the (x - sample_mean) numerator
dxminus = dout * gamma / np.sqrt(sample_var + eps)
# gradient flowing into the sample_mean node
dmean = -np.sum(dxminus, axis=0)
dxmean = np.full(x.shape, 1.0 / x.shape[0]) * dmean
# gradient flowing into the sample_var node
dvar = np.sum(dout * gamma * (x - sample_mean), axis=0)
dxvar = dvar * (-1 / x.shape[0]) * np.power(x, -1.5) * (x - sample_mean)
dx = dxminus + dxmean + dxvar
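For context, here is a minimal training-mode forward pass that would produce such a cache (a sketch; the tuple layout is inferred from the unpacking above, not taken from the original post):

import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    # Per-feature statistics over the batch dimension
    sample_mean = x.mean(axis=0)
    sample_var = x.var(axis=0)
    # Normalize, then scale and shift
    norm = (x - sample_mean) / np.sqrt(sample_var + eps)
    out = gamma * norm + beta
    # Cache layout matches the unpacking in the snippet above
    cache = (x, norm, sample_mean, sample_var, gamma, eps)
    return out, cache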

[Image: the BatchNorm computational graph I used for the derivation]

1 answer:

Answer 0 (score: 0)

Your formula for dx looks incorrect, because in the graph the x node receives backward messages from two other nodes (one via the subtraction x - mean, the other via the mean itself), and it looks like you compute only one of those components:

[Image: backprop through the batchnorm computational graph]

So it should look something like this:

# dxmu1 and dxmu2 are the two gradients arriving at the (x - mu) node
# (one via the normalization branch, one via the variance branch)
dx1 = dxmu1 + dxmu2
dmu = -1 * np.sum(dxmu1 + dxmu2, axis=0)
dx2 = 1. / N * np.ones((N, D)) * dmu
dx = dx1 + dx2

The image comes from this wonderful post; you can also find the complete code there.
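For completeness, here is a sketch of the full staged backward pass in the style of that post. The cache layout (xhat, gamma, xmu, ivar, sqrtvar, var, eps) is an assumption taken from that post's staged forward pass, not from the questioner's code:

import numpy as np

def batchnorm_backward(dout, cache):
    # Assumed cache layout from the staged forward pass:
    # xhat = xmu * ivar, xmu = x - mu, ivar = 1/sqrtvar, sqrtvar = sqrt(var + eps)
    xhat, gamma, xmu, ivar, sqrtvar, var, eps = cache
    N, D = dout.shape

    dbeta = np.sum(dout, axis=0)
    dgamma = np.sum(dout * xhat, axis=0)
    dxhat = dout * gamma

    # Through xhat = xmu * ivar: first gradient into xmu
    divar = np.sum(dxhat * xmu, axis=0)
    dxmu1 = dxhat * ivar

    # Through ivar = 1/sqrtvar and sqrtvar = sqrt(var + eps)
    dsqrtvar = -1.0 / (sqrtvar ** 2) * divar
    dvar = 0.5 / np.sqrt(var + eps) * dsqrtvar

    # Through var = mean(xmu**2, axis=0): second gradient into xmu
    dsq = 1.0 / N * np.ones((N, D)) * dvar
    dxmu2 = 2 * xmu * dsq

    # x receives gradient both directly and through the mean node
    dx1 = dxmu1 + dxmu2
    dmu = -1 * np.sum(dxmu1 + dxmu2, axis=0)
    dx2 = 1.0 / N * np.ones((N, D)) * dmu
    dx = dx1 + dx2

    return dx, dgamma, dbeta

A numerical gradient check against this function is a good way to confirm that all the components of dx are present.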