Strange artifacts from TensorFlow convolutional layers

Date: 2018-01-10 19:23:55

Tags: tensorflow tensorflow-layers

Can someone explain what I am doing wrong, such that my TensorBoard graph contains additional groups when I use tf.layers.conv1d?

For simplicity I created a tf.name_scope 'conv_block1' that contains: conv1d -> max_pool -> batch_norm, but my graph has strange additional blocks (see the attached screenshot). Basically, a superfluous 'conv1d' block was added holding the weights for the conv_block1/conv1d layer, and it was placed outside the group. This makes a network with multiple convolution blocks completely unreadable. Am I doing something wrong, or is this some kind of bug/peculiarity in TensorFlow 1.4? Strangely, the dense layer is fine and its weights are scoped properly.

(Screenshot: graph naming issue described above)

Here is the code in case anyone wants to recreate the graph:

import tensorflow as tf

def cnn_model(inputs, mode):
  x = tf.placeholder_with_default(inputs['wav'], shape=[None, SAMPLE_RATE, 1],  name='input_placeholder')

  with tf.name_scope("conv_block1"):
    x = tf.layers.conv1d(x, filters=80, kernel_size=5, strides=1, padding='same', activation=tf.nn.relu)
    x = tf.layers.max_pooling1d(x, pool_size=3, strides=3)
    x = tf.layers.batch_normalization(x, training=(mode == tf.estimator.ModeKeys.TRAIN))

  x = tf.layers.flatten(x)
  x = tf.layers.dense(x, units=12)
  return x

Update 1

I have added a simpler example that can be executed directly to reproduce the problem:

g = tf.Graph()
with g.as_default():
  x = tf.placeholder(name='input', dtype=tf.float32, shape=[None, 16000, 1])
  with tf.name_scope('group1'):
    x = tf.layers.conv1d(x, 80, 5, name='conv1')
  x = tf.layers.dense(x, 10, name="dense1")
[n.name for n in g.as_graph_def().node]

Output:

['input',
 'conv1/kernel/Initializer/random_uniform/shape',
 'conv1/kernel/Initializer/random_uniform/min',
 'conv1/kernel/Initializer/random_uniform/max',
 'conv1/kernel/Initializer/random_uniform/RandomUniform',
 'conv1/kernel/Initializer/random_uniform/sub',
 'conv1/kernel/Initializer/random_uniform/mul',
 'conv1/kernel/Initializer/random_uniform',
 'conv1/kernel',
 'conv1/kernel/Assign',
 'conv1/kernel/read',
 'conv1/bias/Initializer/zeros',
 'conv1/bias',
 'conv1/bias/Assign',
 'conv1/bias/read',
 'group1/conv1/dilation_rate',
 'group1/conv1/conv1d/ExpandDims/dim',
 'group1/conv1/conv1d/ExpandDims',
 'group1/conv1/conv1d/ExpandDims_1/dim',
 'group1/conv1/conv1d/ExpandDims_1',
 'group1/conv1/conv1d/Conv2D',
 'group1/conv1/conv1d/Squeeze',
 'group1/conv1/BiasAdd',
 'dense1/kernel/Initializer/random_uniform/shape',
 'dense1/kernel/Initializer/random_uniform/min',
 'dense1/kernel/Initializer/random_uniform/max',
 'dense1/kernel/Initializer/random_uniform/RandomUniform',
 'dense1/kernel/Initializer/random_uniform/sub',
 'dense1/kernel/Initializer/random_uniform/mul',
 'dense1/kernel/Initializer/random_uniform',
 'dense1/kernel',
 'dense1/kernel/Assign',
 'dense1/kernel/read',
 'dense1/bias/Initializer/zeros',
 'dense1/bias',
 'dense1/bias/Assign',
 'dense1/bias/read',
 'dense1/Tensordot/Shape',
 'dense1/Tensordot/Rank',
 'dense1/Tensordot/axes',
 'dense1/Tensordot/GreaterEqual/y',
 'dense1/Tensordot/GreaterEqual',
 'dense1/Tensordot/Cast',
 'dense1/Tensordot/mul',
 'dense1/Tensordot/Less/y',
 'dense1/Tensordot/Less',
 'dense1/Tensordot/Cast_1',
 'dense1/Tensordot/add',
 'dense1/Tensordot/mul_1',
 'dense1/Tensordot/add_1',
 'dense1/Tensordot/range/start',
 'dense1/Tensordot/range/delta',
 'dense1/Tensordot/range',
 'dense1/Tensordot/ListDiff',
 'dense1/Tensordot/Gather',
 'dense1/Tensordot/Gather_1',
 'dense1/Tensordot/Const',
 'dense1/Tensordot/Prod',
 'dense1/Tensordot/Const_1',
 'dense1/Tensordot/Prod_1',
 'dense1/Tensordot/concat/axis',
 'dense1/Tensordot/concat',
 'dense1/Tensordot/concat_1/axis',
 'dense1/Tensordot/concat_1',
 'dense1/Tensordot/stack',
 'dense1/Tensordot/transpose',
 'dense1/Tensordot/Reshape',
 'dense1/Tensordot/transpose_1/perm',
 'dense1/Tensordot/transpose_1',
 'dense1/Tensordot/Reshape_1/shape',
 'dense1/Tensordot/Reshape_1',
 'dense1/Tensordot/MatMul',
 'dense1/Tensordot/Const_2',
 'dense1/Tensordot/concat_2/axis',
 'dense1/Tensordot/concat_2',
 'dense1/Tensordot',
 'dense1/BiasAdd']

1 answer:

Answer 0 (score: 1)

OK, I found the problem: apparently tf.name_scope applies only to operations, while tf.variable_scope applies to both operations and variables (as per this tf issue).

Here is a Stack Overflow question explaining the difference between name_scope and variable_scope: What's the difference of name scope and a variable scope in tensorflow?
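The difference is easiest to see by comparing variable and op names directly. Below is a minimal sketch in the TF 1.x graph-style API; the tf.compat.v1 fallback is an assumption added so the snippet also runs on TF 2.x installs, and the scope names 'ns'/'vs' are illustrative:

```python
import tensorflow as tf

# The question targets TF 1.4; on TF 2.x the same API lives under
# tf.compat.v1 (this shim is an assumption for newer installs).
if hasattr(tf, "compat") and hasattr(tf.compat, "v1"):
    tf = tf.compat.v1

g = tf.Graph()
with g.as_default():
    with tf.name_scope('ns'):
        v1 = tf.get_variable('v1', shape=[1])  # name_scope is ignored for variables
        op = tf.add(v1, v1, name='add')        # ...but does apply to ops
    with tf.variable_scope('vs'):
        v2 = tf.get_variable('v2', shape=[1])  # variable_scope prefixes variables too

print(v1.name)  # v1:0      -- escaped the name_scope
print(op.name)  # ns/add:0  -- op stayed inside it
print(v2.name)  # vs/v2:0   -- variable kept inside the variable_scope
```

This is exactly the split visible in the node list above: the conv1d kernel and bias landed outside 'group1', while the conv ops landed inside it.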

g = tf.Graph()
with g.as_default():
  x = tf.placeholder(name='input', dtype=tf.float32, shape=[None, 16000, 1])
  with tf.variable_scope('v_scope1'):
    x = tf.layers.conv1d(x, 80, 5, name='conv1')
[n.name for n in g.as_graph_def().node]

This gives:

['input',
 'v_scope1/conv1/kernel/Initializer/random_uniform/shape',
 'v_scope1/conv1/kernel/Initializer/random_uniform/min',
 'v_scope1/conv1/kernel/Initializer/random_uniform/max',
 'v_scope1/conv1/kernel/Initializer/random_uniform/RandomUniform',
 'v_scope1/conv1/kernel/Initializer/random_uniform/sub',
 'v_scope1/conv1/kernel/Initializer/random_uniform/mul',
 'v_scope1/conv1/kernel/Initializer/random_uniform',
 'v_scope1/conv1/kernel',
 'v_scope1/conv1/kernel/Assign',
 'v_scope1/conv1/kernel/read',
 'v_scope1/conv1/bias/Initializer/zeros',
 'v_scope1/conv1/bias',
 'v_scope1/conv1/bias/Assign',
 'v_scope1/conv1/bias/read',
 'v_scope1/conv1/dilation_rate',
 'v_scope1/conv1/conv1d/ExpandDims/dim',
 'v_scope1/conv1/conv1d/ExpandDims',
 'v_scope1/conv1/conv1d/ExpandDims_1/dim',
 'v_scope1/conv1/conv1d/ExpandDims_1',
 'v_scope1/conv1/conv1d/Conv2D',
 'v_scope1/conv1/conv1d/Squeeze',
 'v_scope1/conv1/BiasAdd']
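Applying the same fix to the original cnn_model, the conv block can be wrapped in tf.variable_scope instead of tf.name_scope so the kernel and bias nodes stay inside the group. This is a sketch under assumptions: the SAMPLE_RATE value, the is_training flag replacing the estimator mode check, and the tf.compat.v1 shim are all mine, not from the original post:

```python
import tensorflow as tf

# TF 1.4-era API; on TF 2.x it lives under tf.compat.v1 (assumed shim).
if hasattr(tf, "compat") and hasattr(tf.compat, "v1"):
    tf = tf.compat.v1

SAMPLE_RATE = 16000  # assumed, matching the repro's placeholder shape

def cnn_model(wav, is_training):
    x = tf.placeholder_with_default(wav, shape=[None, SAMPLE_RATE, 1],
                                    name='input_placeholder')
    # variable_scope (not name_scope) so kernel/bias are grouped as well
    with tf.variable_scope('conv_block1'):
        x = tf.layers.conv1d(x, filters=80, kernel_size=5, strides=1,
                             padding='same', activation=tf.nn.relu)
        x = tf.layers.max_pooling1d(x, pool_size=3, strides=3)
        x = tf.layers.batch_normalization(x, training=is_training)
    x = tf.layers.flatten(x)
    return tf.layers.dense(x, units=12)

g = tf.Graph()
with g.as_default():
    wav = tf.zeros([1, SAMPLE_RATE, 1])
    logits = cnn_model(wav, is_training=False)

# All conv weights now sit under conv_block1/, so TensorBoard renders
# one closed group per block instead of a dangling weights node.
names = [n.name for n in g.as_graph_def().node]
print([n for n in names if n.startswith('conv_block1/conv1d/kernel')][:1])
```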