Output shape of a dense layer in TensorFlow

Date: 2018-10-11 15:32:05

Tags: python-2.7 tensorflow deep-learning

I am writing an AlexNet implementation for CIFAR-10 in TensorFlow. Here is the code for the last few layers:

    ## pool5
    pool5 = tf.nn.max_pool(conv5,
                           ksize=[1, 3, 3, 1],
                           strides=[1, 2, 2, 1],
                           padding="SAME",
                           name='pool5')
    print_activations(pool5)

    # ## Flatten
    # pool5 = tf.contrib.layers.flatten(pool5, outputs_collections=None, scope=None)

    ## FC1
    fc1 = tf.layers.dense(pool5, 4096, activation=tf.nn.relu, trainable=True)
    print_activations(fc1)

    ## FC2
    fc2 = tf.layers.dense(fc1, 4096, activation=tf.nn.relu, trainable=True)
    print_activations(fc2)

    ## Output
    out1 = tf.layers.dense(fc2, 10, activation=None, trainable=True)

As I understand it, the output of the last layer should be a tensor of length 10, or [None, 10] once the batch dimension is included. However, when I inspect `out1.shape.dims`, I get [None, None, None, 10]. Does anyone know what I am missing? Thanks.
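A likely explanation, sketched below with plain NumPy rather than TensorFlow: `tf.layers.dense` contracts only the *last* axis of its input, so feeding it the rank-4 `pool5` tensor of shape [None, H, W, C] keeps the spatial axes and yields [None, H, W, units] — which propagates through to [None, None, None, 10] at the output. The commented-out flatten step is what collapses the spatial axes first. The shapes (2, 2, 2, 256) here are arbitrary illustrative values, not the actual CIFAR-10 dimensions.

```python
import numpy as np

# Stand-in for pool5: [batch, height, width, channels] (illustrative sizes)
x = np.zeros((2, 2, 2, 256))

# A dense layer multiplies along the LAST axis only, like x @ W with
# W of shape [in_features, units]; all leading axes are preserved.
w = np.zeros((256, 4096))
y = x @ w
print(y.shape)  # rank 4: (2, 2, 2, 4096), analogous to [None, None, None, 4096]

# Flattening first collapses the spatial axes into one feature axis,
# which is what tf.contrib.layers.flatten (the commented-out line) does.
x_flat = x.reshape(x.shape[0], -1)       # (2, 1024)
w_flat = np.zeros((x_flat.shape[1], 4096))
y_flat = x_flat @ w_flat
print(y_flat.shape)  # rank 2: (2, 4096), analogous to [None, 4096]
```

With the flatten re-enabled before FC1, the final dense layer should produce the expected [None, 10].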

0 Answers:

No answers yet