Look at this code:
When you run it, you get the output shown after the code.
import tensorflow as tf

with tf.device('/gpu:0'):
    with tf.device('/cpu:0'):
        x = tf.constant(0, name='x')
        x = x * 2   # mul is created while /cpu:0 is active
    y = x + 2       # add is created while /gpu:0 is active

config = tf.ConfigProto(log_device_placement=True)
with tf.Session(config=config) as sess:
    sess.run(y)
mul: (Mul): /job:localhost/replica:0/task:0/cpu:0
2017-08-11 21:38:23.953846: I c:\tf_jenkins\home\workspace\release-win\m\windows-gpu\py\35\tensorflow\core\common_runtime\simple_placer.cc:847] mul: (Mul)/job:localhost/replica:0/task:0/cpu:0
add: (Add): /job:localhost/replica:0/task:0/gpu:0
2017-08-11 21:38:23.954846: I c:\tf_jenkins\home\workspace\release-win\m\windows-gpu\py\35\tensorflow\core\common_runtime\simple_placer.cc:847] add: (Add)/job:localhost/replica:0/task:0/gpu:0
add/y: (Const): /job:localhost/replica:0/task:0/gpu:0
2017-08-11 21:38:23.954846: I c:\tf_jenkins\home\workspace\release-win\m\windows-gpu\py\35\tensorflow\core\common_runtime\simple_placer.cc:847] add/y: (Const)/job:localhost/replica:0/task:0/gpu:0
mul/y: (Const): /job:localhost/replica:0/task:0/cpu:0
2017-08-11 21:38:23.954846: I c:\tf_jenkins\home\workspace\release-win\m\windows-gpu\py\35\tensorflow\core\common_runtime\simple_placer.cc:847] mul/y: (Const)/job:localhost/replica:0/task:0/cpu:0
x: (Const): /job:localhost/replica:0/task:0/cpu:0
2017-08-11 21:38:23.954846: I c:\tf_jenkins\home\workspace\release-win\m\windows-gpu\py\35\tensorflow\core\common_runtime\simple_placer.cc:847] x: (Const)/job:localhost/replica:0/task:0/cpu:0
This means mul ran on the cpu and add ran on the gpu. So I concluded that where an op or tensor is defined determines where it runs.
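To double-check that conclusion, here is a minimal sketch of my own (assuming TensorFlow 1.x; it is not part of the code or logs above) that reads each op's .device attribute instead of parsing the placement log:

import tensorflow as tf

g = tf.Graph()
with g.as_default():
    with tf.device('/gpu:0'):          # outer device scope
        with tf.device('/cpu:0'):      # inner scope, overrides the outer one
            a = tf.constant(0, name='a')
            b = a * 2                  # created while /cpu:0 is active
        c = b + 2                      # created while /gpu:0 is active

# op.device is the device string requested for the op when it was added to
# the graph; the actual runtime placement is what log_device_placement prints.
for op in g.get_operations():
    print(op.name, '->', op.device)

I would expect the ops created under the inner /cpu:0 scope (a and the mul) to report the cpu device and the add to report the gpu device, matching the log above.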
But when I look at the Inception example, I am confused: does where ops or tensors are defined determine where they run?
tower_loss is defined on the cpu, which by my conclusion means each GPU's tower would run on the cpu. But I thought each GPU should run one replica of the model on that GPU. So did I misunderstand something?
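For context, the multi-GPU tower pattern I am referring to looks roughly like this (a simplified sketch from memory, not the exact Inception code; tower_loss and num_gpus here are stand-ins for the real function and flag):

import tensorflow as tf

def tower_loss(scope):
    # Stand-in for the real tower_loss: it just builds a few ops and returns
    # a loss (`scope` is unused in this sketch).
    images = tf.random_normal([32, 784])
    logits = tf.layers.dense(images, 10)
    labels = tf.zeros([32], dtype=tf.int32)
    return tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)

num_gpus = 2  # stand-in value
with tf.device('/cpu:0'):               # outer scope for the whole graph
    losses = []
    for i in range(num_gpus):
        with tf.device('/gpu:%d' % i):  # per-tower scope
            with tf.name_scope('tower_%d' % i) as scope:
                # tower_loss is defined as a plain Python function above,
                # but it is called here, inside the per-GPU device scope.
                losses.append(tower_loss(scope))

It is this nesting (everything under /cpu:0, with each call to tower_loss wrapped in a /gpu:i scope) that I cannot reconcile with the conclusion I drew above.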