The value of a node does not change in TensorFlow even though the variable it depends on has been assigned a new value

Time: 2018-08-14 02:40:57

Tags: tensorflow

I have a simple graph:

import tensorflow as tf
import numpy as np
np.random.seed(7)
tf.set_random_seed(7)

param = np.random.rand(7, 2)
shape = (7, 1)

with tf.Graph().as_default() as graph:
    with tf.Session() as sess:
        mixture = tf.get_variable(
            name='mixture',
            dtype=tf.float32,
            shape=[2, 1],
            initializer=tf.initializers.ones,
            trainable=True)
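        # multiply the (7, 2) constant param by softmax(mixture) and reshape the result to (7, 1)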
        newparam = tf.reshape(tf.matmul(tf.constant(param, dtype=tf.float32), tf.nn.softmax(mixture)), shape)

        assign_op1 = tf.assign(mixture, [[10], [0]])
        assign_op2 = tf.assign(mixture, [[0], [10]])

        sess.run(tf.global_variables_initializer())
        init_param = sess.run(newparam)
        #now assign another value to mixture
        sess.run(assign_op1)
        second_param = sess.run(newparam)
        #now assign another value to mixture
        sess.run(assign_op2)
        third_param = sess.run(newparam)
        print(init_param == second_param)
        print(second_param == third_param)

I want to multiply param (a constant) by the mixture parameter (a trainable variable). I change the value of the mixture variable through the assign ops, but the value of the resulting expression (newparam) does not change at all.

I expected that when I change the value of the mixture variable with an assign op, the value of the newparam node would change as well, but it does not.
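For what it's worth, a minimal sanity check (just a sketch, reusing sess, mixture and assign_op1 from the snippet above) would be to print the raw variable right after the assign, to confirm that the assignment itself takes effect:

        sess.run(assign_op1)
        # print the variable itself to check whether the assign op updated it
        print(sess.run(mixture))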

I am not sure what the problem is. The TensorBoard graph of this model is attached as well.

[TensorBoard graph screenshot]

Update: I have isolated the problem. Even with just newparam = tf.nn.softmax(mixture), the value of the softmax stays the same when I change the value of mixture. I have no idea why!

import tensorflow as tf
import numpy as np

np.random.seed(7)
tf.set_random_seed(7)



with tf.Graph().as_default() as graph:
    with tf.Session() as sess:
        mixture = tf.get_variable(
            name='mixture',
            dtype=tf.float32,
            shape=[2, 1],
            initializer=tf.initializers.ones,
            trainable=True)

        newparam = tf.nn.softmax(mixture)

        assign_op1 = tf.assign(mixture, [[9], [0]])
        assign_op2 = tf.assign(mixture, [[0], [9]])
        #train_writer = tf.summary.FileWriter('./board/' + 'graph',
        #                                     sess.graph)

        sess.run(tf.global_variables_initializer())
        init_param = sess.run(newparam)
        #now assign another value to mixture
        sess.run(assign_op1)
        second_param = sess.run(newparam)
        #now assign another value to mixture
        sess.run(assign_op2)
        third_param = sess.run(newparam)
        print(init_param == second_param)
        print(second_param == third_param)

The == comparisons should return False, because I change mixture through the assign ops, but tf.nn.softmax(mixture) stays the same as its initial value, as if it were no longer connected to mixture.
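Side note: since init_param, second_param and third_param are NumPy arrays, == compares them elementwise; for a single boolean per comparison, something like np.array_equal could be used instead (a small sketch, using the already-imported numpy):

print(np.array_equal(init_param, second_param))   # expected False if newparam changed
print(np.array_equal(second_param, third_param))  # expected False if newparam changed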

0 Answers:

No answers yet