What is the difference between inception_v1.py and inception_v2.py in tensorflow/models?

Date: 2017-03-11 03:58:28

Tags: tensorflow

We know that the Inception v2 paper (Batch Normalization) adds batch normalization after each convolution (before the nonlinearity) to reduce internal covariate shift, and removes local response normalization. But when I studied inception_v1.py and inception_v2.py, I found the code of the two models is almost identical, and I cannot find batch normalization anywhere in inception_v2.py. For example, in inception_v1.py:

end_point = 'Mixed_3b'
with tf.variable_scope(end_point):
    with tf.variable_scope('Branch_0'):
        branch_0 = slim.conv2d(net, 64, [1, 1], scope='Conv2d_0a_1x1')
    with tf.variable_scope('Branch_1'):
        branch_1 = slim.conv2d(net, 96, [1, 1], scope='Conv2d_0a_1x1')
        branch_1 = slim.conv2d(branch_1, 128, [3, 3], scope='Conv2d_0b_3x3')
    with tf.variable_scope('Branch_2'):
        branch_2 = slim.conv2d(net, 16, [1, 1], scope='Conv2d_0a_1x1')
        branch_2 = slim.conv2d(branch_2, 32, [3, 3], scope='Conv2d_0b_3x3')
    with tf.variable_scope('Branch_3'):
        branch_3 = slim.max_pool2d(net, [3, 3], scope='MaxPool_0a_3x3')
        branch_3 = slim.conv2d(branch_3, 32, [1, 1], scope='Conv2d_0b_1x1')
    net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])
And in inception_v2.py:

end_point = 'Mixed_3b'
with tf.variable_scope(end_point):
    with tf.variable_scope('Branch_0'):
      branch_0 = slim.conv2d(net, depth(64), [1, 1], scope='Conv2d_0a_1x1')
    with tf.variable_scope('Branch_1'):
      branch_1 = slim.conv2d(
          net, depth(64), [1, 1],
          weights_initializer=trunc_normal(0.09),
          scope='Conv2d_0a_1x1')
      branch_1 = slim.conv2d(branch_1, depth(64), [3, 3],
                             scope='Conv2d_0b_3x3')
    with tf.variable_scope('Branch_2'):
      branch_2 = slim.conv2d(
          net, depth(64), [1, 1],
          weights_initializer=trunc_normal(0.09),
          scope='Conv2d_0a_1x1')
      branch_2 = slim.conv2d(branch_2, depth(96), [3, 3],
                             scope='Conv2d_0b_3x3')
      branch_2 = slim.conv2d(branch_2, depth(96), [3, 3],
                             scope='Conv2d_0c_3x3')
    with tf.variable_scope('Branch_3'):
      branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')
      branch_3 = slim.conv2d(
          branch_3, depth(32), [1, 1],
          weights_initializer=trunc_normal(0.1),
          scope='Conv2d_0b_1x1')
    net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])

So here is my question: what is the difference between inception_v1.py and inception_v2.py? Thanks very much!

1 Answer:

Answer 0 (score: 1):

inception_v1.py implements this paper (Going Deeper with Convolutions), while inception_v2.py implements the Batch Normalization paper, which is exactly what you noticed. The reason you cannot find batch normalization in the per-branch code is that TF-Slim attaches it through `slim.arg_scope` rather than at each call site: the inception arg scope sets `normalizer_fn=slim.batch_norm` as a default argument for every `slim.conv2d` inside the scope, so each convolution is followed by batch norm even though the `Mixed_*` blocks never mention it.
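The arg-scope mechanism can be sketched in plain Python. Note this is a simplified stand-in, not the real TF-Slim implementation: `arg_scope`, `conv2d`, and `batch_norm` below are toy versions that record an op trace instead of building a graph, just to show how a default `normalizer_fn` gets injected into every conv call inside the scope:

```python
import contextlib

# Stack of default kwargs, mimicking how slim.arg_scope stores scoped defaults.
_SCOPE_STACK = []

@contextlib.contextmanager
def arg_scope(**defaults):
    """Temporarily install default keyword arguments for conv2d."""
    _SCOPE_STACK.append(defaults)
    try:
        yield
    finally:
        _SCOPE_STACK.pop()

def batch_norm(op_trace):
    """Toy normalizer: records that batch norm ran after the conv."""
    return op_trace + ['batch_norm']

def conv2d(op_trace, num_outputs, kernel_size, normalizer_fn=None):
    """Toy conv layer: picks up normalizer_fn from the enclosing arg_scope."""
    if normalizer_fn is None:
        # Innermost scope wins, as with nested slim.arg_scope blocks.
        for defaults in reversed(_SCOPE_STACK):
            if 'normalizer_fn' in defaults:
                normalizer_fn = defaults['normalizer_fn']
                break
    op_trace = op_trace + ['conv%dx%d_out%d' % (kernel_size, kernel_size,
                                                num_outputs)]
    if normalizer_fn is not None:
        op_trace = normalizer_fn(op_trace)
    return op_trace

# Outside any scope: a bare convolution.
print(conv2d([], 64, 1))        # ['conv1x1_out64']

# Inside the scope: every conv is followed by batch norm, even though the
# call site looks identical -- this is why BN is invisible in inception_v2.py.
with arg_scope(normalizer_fn=batch_norm):
    print(conv2d([], 64, 1))    # ['conv1x1_out64', 'batch_norm']
```

In the real code, the network function is meant to be built inside `slim.arg_scope(inception_v2_arg_scope())`, which is where `slim.batch_norm` is wired in.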