TensorFlow batch normalization after restore

Date: 2018-07-02 10:43:07

Tags: python-3.x tensorflow

Let's say we create a small network:

import os
import shutil

import numpy as np
import tensorflow as tf

tf.reset_default_graph()
layers      = [5, 3, 1]
activations = [tf.tanh, tf.tanh, None]

inp = tf.placeholder(dtype=tf.float32, shape=(None, 2), name='inp')
out = tf.placeholder(dtype=tf.float32, shape=(None, 1), name='out')

isTraining = tf.placeholder(dtype=tf.bool, shape=(), name='isTraining')

N = inp * 1 # I am lazy
for i, (l, a) in enumerate(zip(layers, activations)):
    N = tf.layers.dense(N, l, None)
    #N = tf.layers.batch_normalization(N, training=isTraining) # uncomment to enable batch norm
    if a is not None:
        N = a(N)

err = tf.reduce_mean((N - out)**2)

# The moving mean/variance update ops live in the UPDATE_OPS collection;
# make them run alongside every training step.
update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    opt = tf.train.AdamOptimizer(0.05).minimize(err)

# Include the (non-trainable) batch-normalization variables in the
# saver's variable list, since they are not in trainable_variables()
tVars = tf.trainable_variables()
graph = tf.get_default_graph()
for v in graph.get_collection(tf.GraphKeys.GLOBAL_VARIABLES):
    if all([
            ('batch_normalization' in v.name),
            ('optimizer' not in v.name),
            v not in tVars ]):
        tVars.append(v)

init = tf.global_variables_initializer()
saver = tf.train.Saver(var_list=tVars)
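As a quick sanity check (an aside, not part of the original post), you can confirm that the loop above actually picked up the moving statistics:

print([v.name for v in tVars if 'batch_normalization' in v.name])
# expect gamma, beta, moving_mean and moving_variance for each normalized layer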

This is a simple NN set up for optimization. The only thing I am interested in right now is batch normalization (the commented-out line). Now, we train the network, save it, restore it, and compute the error again:

# Generate random data
N = 1000  # note: this reuses the name N, shadowing the network tensor above
X = np.random.rand(N, 2)
y = 2*X[:, 0] + 3*X[:, 1] + 3
y = y.reshape(-1, 1)

# Run the session and save it
with tf.Session() as sess:
    sess.run(init)
    print('During Training')
    for i in range(3000):
        _, errVal = sess.run([opt, err], feed_dict={inp: X, out: y, isTraining: True})
        if i % 500 == 0:
            print(errVal)

    shutil.rmtree('models1', ignore_errors=True)
    os.makedirs('models1')
    path = saver.save(sess, 'models1/model.ckpt')

# Restore the session
print('During testing')
with tf.Session() as sess:
    saver.restore(sess, path)
    errVal = sess.run(err, feed_dict={inp: X, out: y, isTraining: False})
    print(errVal)

Here is the output:

During Training
24.4422
0.00330666
0.000314223
0.000106421
6.00441e-05
4.95262e-05
During testing
INFO:tensorflow:Restoring parameters from models1/model.ckpt
5.5899e-05 

On the other hand, when we uncomment the batch normalization line and rerun the computation above:

During Training
31.7372
1.92066e-05
3.87879e-06
2.55274e-06
1.25418e-06
1.43078e-06
During testing
INFO:tensorflow:Restoring parameters from models1/model.ckpt
0.041519

As you can see, the error after restoring is far from what the trained model produced. Am I doing something wrong?

Note: I know that for batch normalization I should be generating mini-batches. I have skipped all of that to keep the code simple yet complete.
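For reference, a minimal mini-batch training loop might look like the sketch below; batch_size and the per-epoch shuffling are assumptions, not part of the original code:

batch_size = 32  # assumed value, not from the original code
with tf.Session() as sess:
    sess.run(init)
    for epoch in range(100):
        perm = np.random.permutation(N)           # reshuffle each epoch
        for start in range(0, N, batch_size):
            idx = perm[start:start + batch_size]  # indices of one mini-batch
            sess.run(opt, feed_dict={inp: X[idx], out: y[idx], isTraining: True})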

1 Answer:

Answer 0 (score: 0)

The batch normalization layer defined in TensorFlow needs access to the isTraining placeholder (https://www.tensorflow.org/api_docs/python/tf/layers/batch_normalization). Make sure to include it when you define the layer: tf.layers.batch_normalization(..., training=isTraining, ...)
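Concretely, the forward pass and the training op would be wired up as below (a sketch of the pattern described here, reusing the names from the question's code):

N = inp * 1
for i, (l, a) in enumerate(zip(layers, activations)):
    N = tf.layers.dense(N, l, None)
    N = tf.layers.batch_normalization(N, training=isTraining)  # pass the placeholder
    if a is not None:
        N = a(N)

err = tf.reduce_mean((N - out)**2)

# Collect UPDATE_OPS *after* the batch-norm layers exist, so the
# moving mean/variance update ops run with every training step.
update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    opt = tf.train.AdamOptimizer(0.05).minimize(err)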

The reason is that a batch normalization layer has 2 trainable parameters (beta and gamma) that are trained normally along with the rest of the network, but it also has 2 extra parameters (the moving mean and variance of the batches) that are only updated when you tell the layer it is training. You do that simply by following the recipe above.
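You can see the split between the two kinds of parameters by listing the graph's variables (a diagnostic sketch; gamma, beta, moving_mean and moving_variance are the default names that tf.layers.batch_normalization creates):

trainable = set(tf.trainable_variables())
for v in tf.global_variables():
    if 'batch_normalization' in v.name:
        # gamma and beta print True; moving_mean and moving_variance print False
        print(v.name, v in trainable)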

Right now, your code does not appear to be training the mean and variance. Instead they stay fixed at their random initial values, and the network is optimized around them. Later, when you save and restore, they are re-initialized with different values, so the network does not perform as it did before.
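One way to rule out this failure mode (a debugging sketch, not part of the original answer; it reuses path from the question) is to save every global variable and compare one moving statistic before and after restoring:

# Save everything, including the moving statistics, so nothing is re-initialized.
saver = tf.train.Saver(var_list=tf.global_variables())

# Pick one moving statistic to watch across save/restore.
mm = [v for v in tf.global_variables() if 'moving_mean' in v.name][0]

with tf.Session() as sess:
    saver.restore(sess, path)
    print(sess.run(mm))  # should match the value printed just before saving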