文件" test_hdfs.py",save_path = saver.save(sess,hdfs_path +" save_net.ckpt")" {}的父目录不存在,可以& #39; t save。" .format(save_path))

Asked: 2017-11-23 13:13:38

Tags: python hadoop tensorflow hdfs

How can I write checkpoint files and event logs directly to HDFS using saver.save and the tf.summary.FileWriter function?
I run my code:

import tensorflow as tf

# hdfs_path is an HDFS URI (defined as in the full code below)
W = tf.Variable([[1, 2, 3], [3, 4, 5]], dtype=tf.float32, name='weights')
b = tf.Variable([[1, 2, 3]], dtype=tf.float32, name='biases')
init = tf.global_variables_initializer()
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(init)
    save_path = saver.save(sess, hdfs_path + "save_net.ckpt")
    print("Save to path: ", save_path)

When I replace hdfs_path with a local path, it runs fine. But when I use hdfs_path, I get:

File "test_hdfs.py", line 73, in <module>
    save_path = saver.save(sess, hdfs_path+"save_net.ckpt")
  File "/data/anaconda2/lib/python2.7/site-packages/tensorflow/python/training/saver.py", line 1354, in save
    "Parent directory of {} doesn't exist, can't save.".format(save_path))

Something similar happens when I use the tf.summary.FileWriter function: with hdfs_path the program just hangs, while with local_path it runs fine.
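
For comparison, the usual TF 1.x pattern passes the graph via the graph= argument (graph_def= is deprecated) and closes the writer explicitly so buffered events are flushed; a sketch with a hypothetical local log directory:

# Idiomatic TF 1.x FileWriter usage (log directory is a placeholder).
import tensorflow as tf

with tf.Session() as sess:
    # 'graph=' supersedes the deprecated 'graph_def=' argument.
    writer = tf.summary.FileWriter("./logs", graph=sess.graph)
    # ... run the session, writer.add_summary(...) per step ...
    writer.close()  # flush buffered events to disk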

My whole code looks like this:

hdfs_path="hdfs://*" 
local_path = "./" 
with tf.Session(graph=tf.get_default_graph()) as sess: 
    W = tf.Variable([[1,2,3],[3,4,5]], dtype=tf.float32, name='weights') 
    b = tf.Variable([[1,2,3]], dtype=tf.float32, name='biases') 
    init = tf.group(tf.global_variables_initializer(),tf.local_variables_initializer()) 
    saver = tf.train.Saver() 
    sess.run(init) 
    summary_writer = tf.summary.FileWriter(hdfs_path,graph_def=sess.graph_def) 
    saver.save(sess,save_path=hdfs_path+"save_net.ckpt") 
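
One common cause of both symptoms is that the process cannot reach HDFS at all: TensorFlow's hdfs:// support goes through libhdfs, which needs JAVA_HOME, HADOOP_HDFS_HOME, LD_LIBRARY_PATH (containing libjvm.so and libhdfs.so), and a full Hadoop CLASSPATH in place before the first HDFS access. A hedged sketch of that environment, with placeholder paths that must match the actual cluster installation:

# Environment sketch for libhdfs; every path below is a placeholder.
import os

os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk"  # placeholder
os.environ["HADOOP_HDFS_HOME"] = "/opt/hadoop"           # placeholder
# libhdfs reads the full Hadoop classpath when the JVM starts; it is
# usually produced with: ${HADOOP_HDFS_HOME}/bin/hadoop classpath --glob
os.environ["CLASSPATH"] = os.popen(
    os.environ["HADOOP_HDFS_HOME"] + "/bin/hadoop classpath --glob"
).read().strip()
# Note: LD_LIBRARY_PATH (pointing at libjvm.so and libhdfs.so) generally
# has to be exported *before* the Python process starts, not from inside it.

import tensorflow as tf  # import after the environment is in place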

0 Answers:

No answers yet.