Running a training op inside another training op

Asked: 2018-01-16 18:47:55

Tags: python tensorflow machine-learning

I want to run a small training operation inside another training operation, like this:

def get_alphas(weights, filters):
    alphas = tf.Variable(...)
    # Define some loss and training_op here
    with tf.Session() as sess:
        for _ in range(some_epochs):
            sess.run(training_op)
        return tf.convert_to_tensor(sess.run(alphas))

def get_updated_weights(default_weights):
    weights = tf.Variable(default_weights)
    # Some operation on weights to get filters

    # Now, the following will produce errors since weights is not initialized
    alphas = get_alphas(weights, filters)

    # Other option is to initialize it here as follows
    with tf.Session() as sess:
        sess.run(tf.variables_initializer([weights]))
        calculated_filters = sess.run(filters)
        alphas = get_alphas(default_weights, calculated_filters)

    return ...  # some operation on alphas and filters

So, what I want to do is create a variable called weights. alphas and filters depend dynamically on weights (through some training). Now, when weights gets trained, filters changes, since it is created by some operation on weights; but alphas then also needs to change, and it can only be found through another training operation.

If the intent is not clear from the above, I can provide the exact functions.

1 Answer:

Answer 0: (score: 1)

The trick you describe won't work, because tf.Session.close releases all associated resources, such as variables, queues, and readers. So the result of get_alphas will not be a valid tensor.

The best practice is to define several losses and training ops (affecting different parts of the graph) and run them selectively within a single session, as needed.

alphas = tf.Variable(...)
# Define some loss and training_op here

def get_alphas(sess, weights, filters):
  for _ in range(some_epochs):
    sess.run(training_op)

# The rest of the training...
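To make the single-session pattern concrete, here is a minimal runnable sketch. The shapes, losses, and learning rate are made up for illustration; filters is a stand-in for "some operation on weights", and the code is written against the tf.compat.v1 API so it also runs under TensorFlow 2 (in 2018 this would simply be tf).

```python
import tensorflow as tf

tf1 = tf.compat.v1          # hypothetical alias; under TF 1.x use `tf` directly
tf1.disable_eager_execution()

# Both variables live in the same graph; each training op
# updates only its own variable via var_list.
weights = tf1.get_variable("weights", shape=[4], initializer=tf1.ones_initializer())
filters = tf.square(weights)              # stand-in for "some operation on weights"
alphas = tf1.get_variable("alphas", shape=[4], initializer=tf1.zeros_initializer())

# Illustrative losses: weights is pulled toward 2.0, alphas toward filters.
weights_loss = tf.reduce_sum(tf.square(weights - 2.0))
alphas_loss = tf.reduce_sum(tf.square(alphas - filters))

opt = tf1.train.GradientDescentOptimizer(0.1)
train_weights = opt.minimize(weights_loss, var_list=[weights])
train_alphas = opt.minimize(alphas_loss, var_list=[alphas])

with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    for _ in range(100):
        sess.run(train_weights)           # outer training op
        for _ in range(10):               # inner "training op" for alphas
            sess.run(train_alphas)
    final_alphas = sess.run(alphas)

print(final_alphas)                       # alphas tracks filters = weights**2 -> ~4.0
```

Because everything runs in one session, alphas stays a valid variable after the inner loop and keeps tracking filters as weights changes, which is exactly what the closed-session version cannot do.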