At the end of each epoch, I save the weights of my Keras model whenever the loss improves:
def save_best_weights_if_required(self, loss, acc):
    if self.best_loss is None or loss < self.best_loss:
        logger.info("Saving 'best_weights.h5'...")
        self.save_weights('best_weights.h5')
        self.best_acc = acc
        self.best_loss = loss

def on_epoch_end(self, epoch, logs=None):
    self.epoch_logs.append(logs)
    self.save_best_weights_if_required(logs['val_loss'], logs['val_acc'])
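For reference, the built-in ModelCheckpoint callback expresses the same save-best-weights pattern. The sketch below is illustrative only; the tiny model and random data are placeholders, not my actual code:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import ModelCheckpoint

# Stand-in model and data, just to make the example runnable.
model = Sequential([Dense(16, activation='relu', input_shape=(10,)),
                    Dense(1, activation='sigmoid')])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['acc'])

x = np.random.rand(256, 10)
y = np.random.randint(0, 2, size=(256, 1))

checkpoint = ModelCheckpoint('best_weights.h5',
                             monitor='val_loss',      # track validation loss
                             save_best_only=True,     # overwrite only on improvement
                             save_weights_only=True,  # weights only, like save_weights()
                             verbose=1)

model.fit(x, y, validation_split=0.2, epochs=5, callbacks=[checkpoint])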
As you can see, the function is called repeatedly with the same filename. The first save takes up to a few minutes to complete; subsequent saves finish almost immediately.
I also get the following error:
File "/opt/anaconda3/envs/py35/lib/python3.5/site-packages/keras/engine/network.py", line 1104, in save
save_model(self, filepath, overwrite, include_optimizer)
File "/opt/anaconda3/envs/py35/lib/python3.5/site-packages/keras/engine/saving.py", line 175, in save_model
'weight_names'] = weight_names
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "/opt/anaconda3/envs/py35/lib/python3.5/site-packages/h5py/_hl/attrs.py", line 95, in __setitem__
self.create(name, data=value, dtype=base.guess_dtype(value))
File "/opt/anaconda3/envs/py35/lib/python3.5/site-packages/h5py/_hl/attrs.py", line 188, in create
attr = h5a.create(self._id, self._e(tempname), htype, space)
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "h5py/h5a.pyx", line 47, in h5py.h5a.create
RuntimeError: Unable to create attribute (object header message is too large)
This happens on one of the later saves, not on the first.
How is this possible? If the problem were with the node names, how could the first save have succeeded?
And why did the first save take so much longer?