I'm trying to delete a dataset in an HDF5 file. Specifically, I'm trying to remove the optimizer weights from a Keras deep learning model that I previously trained and saved.
The code is:
f = h5py.File('model.h5', 'r+')
del f['optimizer_weights']
f.close()
The error is:
KeyError: "Couldn't delete link (Can't delete self)"
Detailed error message:
del f['optimizer_weights']
File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (D:\Build\h5py\h5py-2.7.0\h5py\_objects.c:2853)
File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (D:\Build\h5py\h5py-2.7.0\h5py\_objects.c:2811)
File "C:\Users\Anaconda3\envs\tensorflow-keras-gpu\lib\site-packages\h5py\_hl\group.py", line 297, in __delitem__
self.id.unlink(self._e(name))
File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (D:\Build\h5py\h5py-2.7.0\h5py\_objects.c:2853)
File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (D:\Build\h5py\h5py-2.7.0\h5py\_objects.c:2811)
File "h5py\h5g.pyx", line 294, in h5py.h5g.GroupID.unlink (D:\Build\h5py\h5py-2.7.0\h5py\h5g.c:4179)
KeyError: "Couldn't delete link (Can't delete self)"
Any suggestions on how to fix this?
Thanks!
Answer 0 (score: 1)
Are you sure the dataset is actually there? I've run into this exact error when trying to delete a dataset that doesn't exist. For example, to list everything in the file:
def printname(name):
    print(name)

# f is the open h5py.File object from the question
f.visit(printname)
# list of datasets, should contain 'dataset_name'
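Applied to the file from the question, a minimal check might look like the sketch below ('model.h5' and 'optimizer_weights' are the names from the question; the exact top-level keys depend on how the model was saved):

import h5py

with h5py.File('model.h5', 'r') as f:
    # top-level groups in a Keras-saved model, e.g. 'model_weights' and,
    # if the optimizer state was saved, 'optimizer_weights'
    print(list(f.keys()))
    print('optimizer_weights' in f)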
Unfortunately, in code you need to check for existence before every delete. To "overwrite" a dataset that may already exist:
with h5py.File('/path/to/h5', 'a') as f:
    if f.get('dataset_name'):
        del f['dataset_name']
    f['dataset_name'] = 'new value'
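The same guarded delete, sketched for the question's case (reusing the question's file and key names; h5py groups also support the in operator, which does the same existence check as f.get above):

with h5py.File('model.h5', 'a') as f:
    # only delete the optimizer state if the group is actually present
    if 'optimizer_weights' in f:
        del f['optimizer_weights']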
Answer 1 (score: 0)
Quite an old question, but try opening the file in write or append mode:
f = h5py.File('model.h5', 'a')
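For reference, this is the question's snippet with only the mode flag changed, as this answer suggests ('a' opens the file read/write and creates it if it does not exist, while 'r+' also opens read/write but requires the file to exist):

import h5py

f = h5py.File('model.h5', 'a')  # append mode, as suggested above
del f['optimizer_weights']
f.close()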