Am I using tf.get_variable() correctly?

Asked: 2017-03-16 11:39:07

Tags: tensorflow

I read here that it is recommended to always use tf.get_variable(...), but it seems a bit cumbersome when I try to implement a network.

For example:

def create_weights(shape, name='weights',
                   initializer=tf.random_normal_initializer(0, 0.1)):
    weights = tf.get_variable(name, shape, initializer=initializer)
    print("weights created named: {}".format(weights.name))
    return weights

def LeNet(in_units, keep_prob):

    # define the network
    with tf.variable_scope("conv1"):
        conv1 = conv(in_units, create_weights([5, 5, 3, 32]), create_bias([32]))
        pool1 = maxpool(conv1)

    with tf.variable_scope("conv2"):
        conv2 = conv(pool1, create_weights([5, 5, 32, 64]), create_bias([64]))
        pool2 = maxpool(conv2)

    # reshape the network to feed it into the fully connected layers
    with tf.variable_scope("flatten"):
        flatten = tf.reshape(pool2, [-1, 1600])
        flatten = dropout(flatten, keep_prob)

    with tf.variable_scope("fc1"):
        fc1 = fc(flatten, create_weights([1600, 120]), biases = create_bias([120]))
        fc1 = dropout(fc1, keep_prob)

    with tf.variable_scope("fc2"):
        fc2 = fc(fc1, create_weights([120, 84]), biases = create_bias([84]))

    with tf.variable_scope("logits"):
        logits = fc(fc2, create_weights([84, 43]), biases = create_bias([43]))

    return logits

I have to use create_weights every time I call with tf.variable_scope(...), and besides, if I want to change the conv1 weights to [7, 7, 3, 32] instead of [5, 5, 3, 32], I have to restart the kernel because the variable already exists. On the other hand, if I use tf.Variable(...) instead, I don't run into any of these problems.
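For what it's worth, the "variable already exists" situation can be reproduced and worked around without restarting the kernel. A minimal sketch, assuming the TF 1.x graph-mode API (written here against `tf.compat.v1` so it also runs under TF 2; in TF 1.x this is just `import tensorflow as tf`):

```python
import tensorflow.compat.v1 as tf  # in TF 1.x, just `import tensorflow as tf`

tf.disable_v2_behavior()  # keep TF 1.x graph-mode semantics under TF 2

# First definition of the conv1 weights, as in the question:
with tf.variable_scope("conv1"):
    weights = tf.get_variable("weights", [5, 5, 3, 32],
                              initializer=tf.random_normal_initializer(0, 0.1))

# Asking for the same name again (even with a new shape) raises ValueError,
# because conv1/weights already lives in the default graph's variable store:
try:
    with tf.variable_scope("conv1"):
        tf.get_variable("weights", [7, 7, 3, 32])
    duplicate_failed = False
except ValueError:
    duplicate_failed = True

# Resetting the default graph discards every variable created so far,
# so the shape can be changed without restarting the kernel:
tf.reset_default_graph()
with tf.variable_scope("conv1"):
    weights = tf.get_variable("weights", [7, 7, 3, 32],
                              initializer=tf.random_normal_initializer(0, 0.1))
print(weights.shape)
```

`tf.reset_default_graph()` is the graph-mode equivalent of a kernel restart for variables: the old graph (and its variable store) is simply replaced with a fresh one.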

Am I using tf.variable_scope(...) incorrectly?

1 answer:

Answer 0 (score: 0)

It seems you cannot change something that already exists in a variable scope, so you can only change a previously defined variable by restarting the kernel. (In fact, you then create a new variable, since the previous one has been deleted.)

...

This is just my guess... I would appreciate it if someone could give a detailed answer.
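One detail worth adding: the whole point of tf.get_variable over tf.Variable is variable *sharing*, and the "already exists" error is deliberate. With `reuse=True` on the scope, get_variable fetches the existing variable instead of erroring. A minimal sketch under the same TF 1.x-style assumption (`tf.compat.v1`), reusing the question's create_weights helper:

```python
import tensorflow.compat.v1 as tf  # in TF 1.x, just `import tensorflow as tf`

tf.disable_v2_behavior()  # keep TF 1.x graph-mode semantics under TF 2

def create_weights(shape, name='weights',
                   initializer=tf.random_normal_initializer(0, 0.1)):
    return tf.get_variable(name, shape, initializer=initializer)

# First call creates fc1/weights:
with tf.variable_scope("fc1"):
    w1 = create_weights([1600, 120])

# With reuse=True, get_variable returns the existing variable
# instead of raising "already exists":
with tf.variable_scope("fc1", reuse=True):
    w2 = create_weights([1600, 120])

print(w1.name)      # fc1/weights:0
print(w1 is w2)     # True -- the very same Variable object
```

So restarting the kernel is only needed if you want a *different* variable under the same name; for sharing the same weights between two code paths (e.g. a training and an inference network), `reuse=True` (or `reuse=tf.AUTO_REUSE`) is the intended mechanism.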