TensorFlow AdamOptimizer beta1_power not initialized

Date: 2017-08-04 04:36:58

Tags: python tensorflow deep-learning

I am doing transfer learning: I pre-trained a network and saved its variables (w, b) to a file; closed that program; opened another project; restored all the old variables; defined some new variable layers and initialized them; then started retraining. The SGD optimizer works in my code, but if I change the optimizer to Adam, it gives me the following error:

2017-08-03 21:28:08.785092: W tensorflow/core/framework/op_kernel.cc:1152] Failed precondition: Attempting to use uninitialized value beta1_power

My code:

# Session Start
sess = tf.Session()
# restore pre-trained parameters
saver = tf.train.Saver()
saver.restore(sess, "./pre_train/step1.ckpt")
# init new parameters
weights2 = {
    'fnn_w1': tf.Variable(tf.random_normal([n_hidden_2, n_hidden_1], stddev=sd), name='fnn_w1'),
    'fnn_w2': tf.Variable(tf.random_normal([n_hidden_1, 1], stddev=sd), name='fnn_w2')
}
biases2 = {
    'fnn_b1': tf.Variable(tf.ones([n_hidden_1]), name='fnn_b1'),
    'fnn_b2': tf.Variable(tf.ones([1]), name='fnn_b2')
}
parameters2 = {**weights2, **biases2}
init_params2 = tf.variables_initializer(parameters2.values())
sess.run(init_params2)

# Construct model
encoder_op = encoder(X)
focusFnn_op = focusFnn(encoder_op)  # for one gene a time prediction
decoder_op = decoder(encoder_op)  # for pearson correlation of the whole matrix #bug (8092, 0)

# Prediction and truth
y_pred = focusFnn_op  # [m, 1]
y_true = X[:, j]
y_benchmark = M[:, j]  # benchmark for cost_fnn
M_train = df2_train.values[:, j:j+1]  # benchmark for corr
M_valid = df2_valid.values[:, j:j+1]

# Define loss and optimizer, minimize the squared error
with tf.name_scope("Metrics"):
    cost_fnn = tf.reduce_mean(tf.pow(y_true - y_pred, 2))
    cost_fnn_benchmark = tf.reduce_mean(tf.pow(y_pred - y_benchmark, 2))
    cost_decoder = tf.reduce_mean(tf.pow(X - decoder_op, 2))
    cost_decoder_benchmark = tf.reduce_mean(tf.pow(decoder_op - M, 2))
    tf.summary.scalar('cost_fnn', cost_fnn)
    tf.summary.scalar('cost_fnn_benchmark', cost_fnn_benchmark)
    tf.summary.scalar('cost_decoder', cost_decoder)
    tf.summary.scalar('cost_decoder_benchmark', cost_decoder_benchmark)

# optimizer = (
#     tf.train.GradientDescentOptimizer(learning_rate).
#     minimize(cost_fnn, var_list=list(weights2.values()) + list(biases2.values()))
# )  # freezes all other variables

optimizer = (
    tf.train.GradientDescentOptimizer(learning_rate).
    minimize(cost_fnn)
)  # updates all trainable variables
print("# Updated layers: ", "fnn layers\n")

train_writer = tf.summary.FileWriter(log_dir+'/train', sess.graph)
valid_writer = tf.summary.FileWriter(log_dir+'/valid', sess.graph)
# benchmark_writer = tf.summary.FileWriter(log_dir+'/benchmark', sess.graph)

# Evaluate the init network
[cost_train, h_train] = sess.run([cost_fnn, y_pred], feed_dict={X: df_train.values})
[cost_valid, h_valid] = sess.run([cost_fnn, y_pred], feed_dict={X: df_valid.values})

6 Answers:

Answer 0 (score: 0)

The beta variables are variables used by the AdamOptimizer, and they need to be initialized just like the others. You can initialize them with tf.global_variables_initializer() after creating the optimizer, or find the variables and initialize them directly with tf.variables_initializer().
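A minimal sketch of the second approach (initializing only the variables Adam created, leaving the restored ones untouched). The toy graph and variable names here are illustrative, and the TF1-style API is accessed through `tf.compat.v1` so the snippet also runs on TF 2.x:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Toy graph: one "restored" variable and an Adam train op.
x = tf.Variable(1.0, name='x')
loss = tf.square(x)
train_op = tf.train.AdamOptimizer(0.01).minimize(loss)

sess = tf.Session()
sess.run(x.initializer)  # stand-in for saver.restore() of the old variables

# Initialize only what Adam added: beta1_power, beta2_power, and the slots.
adam_vars = [v for v in tf.global_variables() if v is not x]
sess.run(tf.variables_initializer(adam_vars))

sess.run(train_op)  # no "uninitialized value beta1_power" error now
```

Running `tf.global_variables_initializer()` instead would also work, but it would overwrite the restored variables, which is exactly what transfer learning wants to avoid.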

Answer 1 (score: 0)

After our comment exchange, not sure whether you ever got anywhere, but I ran into the same problem and solved it by running:

with tf.Session(graph=graph) as sess:
    sess.run(sess.graph.get_tensor_by_name('beta1_power/Assign:0'))
    sess.run(sess.graph.get_tensor_by_name('beta2_power/Assign:0'))
    ...

Answer 2 (score: 0)

I solved the problem by saving all variables in step 1, rather than only the weights and biases. In step 2, define the train_op first, and only then call saver.restore().
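A self-contained sketch of that ordering (model, checkpoint path, and shapes are illustrative, using `tf.compat.v1` so it runs on TF 2.x as well). Creating the Adam train_op before the Saver means the Saver's default variable list includes Adam's beta1_power/beta2_power and slot variables, so both save and restore cover them:

```python
import os
import tempfile

import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Define the train_op *before* creating the Saver, so Adam's internal
# variables exist and are part of the Saver's default var_list.
w = tf.Variable(tf.zeros([2, 2]), name='w')
loss = tf.reduce_sum(tf.square(w))
train_op = tf.train.AdamOptimizer(0.001).minimize(loss)  # first
saver = tf.train.Saver()                                 # then the Saver

sess = tf.Session()
sess.run(tf.global_variables_initializer())
ckpt = os.path.join(tempfile.mkdtemp(), 'step1.ckpt')  # illustrative path
saver.save(sess, ckpt)

saver.restore(sess, ckpt)  # restores Adam's state too, so no error
sess.run(train_op)
```

If the Saver were created before the optimizer, the checkpoint would simply not contain the Adam variables, and restoring it would leave them uninitialized.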

Answer 3 (score: 0)

I solved the problem with the following steps:


It seems to work. First I initialized the global variables, then loaded the pre-trained weights. Note that you must define the AdamOptimizer beforehand, then run:

train_op = tf.train.AdamOptimizer(0.001).minimize(loss)
# Initialize the variables and reload weights
sess.run(tf.global_variables_initializer())
net.load_initial_weights(sess)

If the optimizer is created only after tf.global_variables_initializer() has run, its beta1_power/beta2_power variables do not exist yet at that point, so they remain uninitialized.

A senior classmate of mine just taught me this. Many thanks to him.

Answer 4 (score: 0)

I solved it by adding:

tf.reset_default_graph() 

before graph construction.

However, you still have to make sure the solver is declared before the Saver is created:

solver = tf.train.AdamOptimizer().minimize(loss)
...
saver = tf.train.Saver()

References: http://stackoverflow.com/questions/33765336/remove-nodes-from-graph-or-reset-entire-default-graph https://github.com/tflearn/tflearn/blob/master/examples/basics/weights_loading_scope.py

Answer 5 (score: 0)

The Adam optimizer's variables must be initialized before it can be used. To initialize them:

optimizer = tf.train.AdamOptimizer()
sess.run(tf.variables_initializer(optimizer.variables()))