How to display a tf.Variable

Date: 2018-10-03 23:13:10

Tags: tensorflow

I am new to TensorFlow and I would like to know how to visualize the contents of a tf.Variable. I tried %f and %s, but nothing was displayed; that is my mistake. I am posting the code I am using, and I appreciate any reply.

import tensorflow as tf
sess = tf.Session()
init = tf.global_variables_initializer()  # note: built before any variables exist
sess.run(init)
startIter = 2000
globalStep = tf.Variable(startIter, trainable=False)
x = tf.Variable(5.0, name="counter") 
for i in range(startIter): 
    totalLoss = x**2-20.0*x+1.0 
    opt = tf.train.AdamOptimizer(learning_rate=0.0001) 
    grads = opt.compute_gradients(totalLoss)
    grads = [(None, var) if grad is None else (tf.clip_by_value(grad, -1.0, 1.0), var) for grad, var in grads]
    applyGradientOp = opt.apply_gradients(grads, global_step=globalStep)
    #print("opt.get_name(): ",opt.get_name(),"opt._lr: ",opt._lr,"opt._lr_t: %f "% (sess.run(opt._lr_t)))  #jll1
    print("opt.get_slot_names: ",opt.get_slot_names())
    print('  ', opt.get_slot(var,'m'))  # here
    print('  ', opt.get_slot(var,'v'))  # here
    assign_op = tf.assign(x, x + 1)

It displays this result:

('opt.get_slot_names: ', ['m', 'v'])
('  ', <tf.Variable 'counter/Adam_614:0' shape=() dtype=float32_ref>)
('  ', <tf.Variable 'counter/Adam_615:0' shape=() dtype=float32_ref>)

But I would like to visualize the value itself, if that is possible, of course.

I know they are AdamOptimizer slots; I am trying to display the learning rate at every step. I have reviewed other answers, but they do not work. Using:

print("opt.get_name():", opt.get_name(), "opt._lr:", opt._lr, "opt._lr_t:", opt._lr_t)  # jll1

printed beforehand, the result is the same.
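For what it's worth, opt._lr_t holds the constant base rate (0.0001), so printing it at every step will always show the same number. What changes per step is Adam's bias-corrected effective step size; a minimal pure-Python sketch of that formula (adam_effective_lr is a hypothetical helper following the Adam paper, not a TF API):

```python
import math

def adam_effective_lr(base_lr, t, beta1=0.9, beta2=0.999):
    """Bias-corrected step size alpha_t = alpha * sqrt(1 - beta2^t) / (1 - beta1^t).
    opt._lr_t only ever holds the constant base rate alpha."""
    return base_lr * math.sqrt(1.0 - beta2 ** t) / (1.0 - beta1 ** t)
```

Early in training this is smaller than the base rate (about 0.316 * base_lr at t=1) and it converges to the base rate as t grows.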

3 Answers:

Answer 0 (score: 0)

To see the value inside a tf.Variable, you need to run it in a TensorFlow session. This should work:

print('  ', sess.run(opt.get_slot(var,'m')))  # here
print('  ', sess.run(opt.get_slot(var,'v')))  # here
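As background, the 'm' and 'v' slots are Adam's first- and second-moment estimates of the gradient. A minimal pure-Python sketch of how they accumulate (my own helper, following the Adam paper's update rules, not TF's internal implementation):

```python
def adam_moments(grads, beta1=0.9, beta2=0.999):
    """Accumulate Adam's first (m) and second (v) moment estimates
    over a sequence of scalar gradients."""
    m = v = 0.0
    for g in grads:
        m = beta1 * m + (1.0 - beta1) * g        # exponential average of g
        v = beta2 * v + (1.0 - beta2) * g * g    # exponential average of g^2
    return m, v
```

A single gradient of 1.0 gives m = 0.1 and v = 0.001, which is the kind of small value you should expect to see in the slots after the first update.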

Answer 1 (score: 0)

You can use the following, but this requires you to add it to the computation graph:

tf.Print()

Answer 2 (score: 0)

Thank you very much for your help; I have rearranged the code.

import tensorflow as tf
sess = tf.Session()   #jll2
startIter = 2000
globalStep = tf.Variable(startIter, trainable=False)
x = tf.Variable(5.0, name="counter")
for i in range(startIter): 
    totalLoss = x**2-20.0*x+1.0  
    opt = tf.train.AdamOptimizer(learning_rate=0.0001)  
#    print("opt.get_name(): ",opt.get_name(),"opt._lr: ",opt._lr,"opt._lr_t: ",opt._lr_t)  #jll1
    grads = opt.compute_gradients(totalLoss)
    grads = [(None, var) if grad is None else (tf.clip_by_value(grad, -1.0, 1.0), var) for grad, var in grads]
    applyGradientOp = opt.apply_gradients(grads, global_step=globalStep)
#    print('Learning rate opt._lr_t: %f' % (sess.run(opt._lr_t)))    #jll3             
#    print("opt.get_name(): ",opt.get_name(),"opt._lr: ",opt._lr,"opt._lr_t: %f "% (sess.run(opt._lr_t)))  #jll1
    print("opt.get_slot_names: ",opt.get_slot_names())
    #### **by matwilso**
    with sess.as_default():
        # note: the slots print 0.0 because applyGradientOp is never passed to
        # sess.run, and re-running the initializer each iteration resets them anyway
        sess.run(tf.global_variables_initializer())
        print('m:', sess.run(opt.get_slot(var, 'm')), 'v:', sess.run(opt.get_slot(var, 'v')))
    #### **by Andreas Pasternak**
    with sess.as_default():
        sess.run(tf.global_variables_initializer())
        print_node1 = tf.Print(opt.get_slot(var, 'm'), [opt.get_slot(var, 'm')], 'm')
        print_node2 = tf.Print(opt.get_slot(var, 'v'), [opt.get_slot(var, 'v')], 'v')
        print("['m', 'v']:", sess.run([print_node1, print_node2]))
    assign_op = tf.assign(x, x + 1)
# thanks

Now I get:

('opt.get_slot_names: ', ['m', 'v'])
('m:', 0.0, 'v:', 0.0)
("['m', 'v']:", [0.0, 0.0])
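The zeros are expected: applyGradientOp is never executed, so the slots are never updated. A pure-Python simulation of the intended loop (gradient of x**2 - 20*x + 1, clipped to [-1, 1], pushed through Adam's textbook update; run_adam is a hypothetical sketch, not TF code) shows the slots moving away from zero once updates actually run:

```python
def run_adam(x=5.0, lr=1e-4, beta1=0.9, beta2=0.999, eps=1e-8, steps=3):
    """Minimize x**2 - 20*x + 1 with clipped gradients and Adam updates,
    recording the (m, v) slot values after each step."""
    m = v = 0.0
    history = []
    for t in range(1, steps + 1):
        g = 2.0 * x - 20.0               # d/dx of x**2 - 20*x + 1
        g = max(-1.0, min(1.0, g))       # tf.clip_by_value(grad, -1.0, 1.0)
        m = beta1 * m + (1.0 - beta1) * g
        v = beta2 * v + (1.0 - beta2) * g * g
        m_hat = m / (1.0 - beta1 ** t)   # bias-corrected moments
        v_hat = v / (1.0 - beta2 ** t)
        x -= lr * m_hat / (v_hat ** 0.5 + eps)
        history.append((m, v))
    return x, history
```

After one step from x = 5.0 the clipped gradient is -1.0, so the slots become m = -0.1 and v = 0.001 rather than zero, and x moves toward the minimum at x = 10.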