Batch-wise batch normalization in TensorFlow

Date: 2017-10-28 11:34:44

Tags: machine-learning tensorflow deep-learning conv-neural-network

What is the correct way of performing batch-wise batch normalization in TensorFlow? (That is, I do not want to compute running means and variances.) My current implementation is based on tf.nn.batch_normalization, where x is the output of a convolutional layer with shape [batch_size, width, height, num_channels]. I want to perform batch norm batch-wise.

batch_mean, batch_var = tf.nn.moments(x, axes=[0, 1, 2])
x = tf.nn.batch_normalization(x, batch_mean, batch_var, offset=0, scale=0, variance_epsilon=1e-6)
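For reference, the per-channel statistics that tf.nn.moments computes over axes [0, 1, 2] can be reproduced in NumPy. This is a sketch with made-up shapes, applying the normalization with scale 1 and offset 0; note that tf.nn.batch_normalization skips the affine step only when scale/offset are None, so passing scale=0 as above multiplies the normalized output by zero.

```python
import numpy as np

# Hypothetical activations shaped like the question's x:
# [batch_size, width, height, num_channels].
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(4, 8, 8, 16))

# Per-channel mean/variance over the batch and spatial axes,
# mirroring tf.nn.moments(x, axes=[0, 1, 2]).
batch_mean = x.mean(axis=(0, 1, 2))
batch_var = x.var(axis=(0, 1, 2))

# Normalize as tf.nn.batch_normalization would with scale=1, offset=0.
eps = 1e-6
x_norm = (x - batch_mean) / np.sqrt(batch_var + eps)

# Each channel now has approximately zero mean and unit variance.
print(x_norm.mean(axis=(0, 1, 2)).round(6))
print(x_norm.var(axis=(0, 1, 2)).round(6))
```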

However, the results of this implementation are very poor. A comparison with tensorflow.contrib.slim.batch_norm shows that it fares far worse (similarly poor training performance).


What am I doing wrong, and what could explain this poor performance?

2 answers:

Answer 0 (score: 1):

You could consider tf.contrib.layers.layer_norm. You may want to reshape x to [batch, channel, width, height] and set begin_norm_axis=2 for channel-wise normalization (each batch element and each channel will be normalized independently).
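As a sanity check, the statistics that layer_norm would compute with begin_norm_axis=2 can be sketched in NumPy: normalization runs over all axes from index 2 onward, i.e. over each [width, height] slice. The layer's learned gain and bias are left out of this sketch.

```python
import numpy as np

# Hypothetical activations already transposed to
# [batch, channel, width, height].
rng = np.random.default_rng(1)
x = rng.normal(loc=-1.0, scale=3.0, size=(2, 3, 5, 5))

# begin_norm_axis=2 normalizes over axes 2 and onward, so each
# (batch, channel) slice of shape [width, height] is standardized
# independently.
mean = x.mean(axis=(2, 3), keepdims=True)
var = x.var(axis=(2, 3), keepdims=True)
x_norm = (x - mean) / np.sqrt(var + 1e-6)

# Every (batch, channel) slice now has mean ~0 and variance ~1.
print(x_norm.mean(axis=(2, 3)).round(6))
```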

Here is an example of how to reshape from the original order to [batch, channel, width, height]:

import tensorflow as tf

sess = tf.InteractiveSession()

batch = 2
height = 2
width = 2
channel = 3

tot_size = batch * height * channel * width

ts_4D_bhwc = tf.reshape(tf.range(tot_size), [batch, height, width, channel])
ts_4D_bchw = tf.transpose(ts_4D_bhwc, perm=[0,3,1,2])

print("Original tensor w/ order bhwc\n")
print(ts_4D_bhwc.eval())

print("\nTransformed tensor w/ order bchw\n")
print(ts_4D_bchw.eval())

Output:

Original tensor w/ order bhwc

[[[[ 0  1  2]
   [ 3  4  5]]

  [[ 6  7  8]
   [ 9 10 11]]]


 [[[12 13 14]
   [15 16 17]]

  [[18 19 20]
   [21 22 23]]]]

Transformed tensor w/ order bchw

[[[[ 0  3]
   [ 6  9]]

  [[ 1  4]
   [ 7 10]]

  [[ 2  5]
   [ 8 11]]]


 [[[12 15]
   [18 21]]

  [[13 16]
   [19 22]]

  [[14 17]
   [20 23]]]]

Answer 1 (score: 0):

@Maosi's solution works, but I found it slow. The following is simple and fast.

batch_mean, batch_var = tf.nn.moments(x, axes=[0, 1, 2])
x = tf.subtract(x, batch_mean)
x = tf.div(x, tf.sqrt(batch_var) + 1e-6)
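A quick NumPy check of this recipe (a sketch with made-up shapes; note it adds the epsilon after the square root, sqrt(var) + 1e-6, whereas tf.nn.batch_normalization uses sqrt(var + eps) — numerically near-identical for non-degenerate variances):

```python
import numpy as np

# Hypothetical conv-layer output [batch_size, width, height, num_channels].
rng = np.random.default_rng(2)
x = rng.normal(loc=5.0, scale=0.5, size=(4, 8, 8, 16))

# Mirrors tf.nn.moments(x, axes=[0, 1, 2]) followed by the
# subtract/divide in the snippet above.
batch_mean = x.mean(axis=(0, 1, 2))
batch_var = x.var(axis=(0, 1, 2))
x_norm = (x - batch_mean) / (np.sqrt(batch_var) + 1e-6)

# Per-channel mean ~0 and variance ~1, as expected for batch norm
# without a learned scale/offset.
print(x_norm.mean(axis=(0, 1, 2)).round(6))
```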