I'm using Silo with HDF5, and I can't access some of the metadata with h5py. Silo spits out a fairly unusual HDF5 structure that places a DATATYPE inside a DATATYPE. Here is an excerpt from the h5dump output:
DATATYPE "sigma_t" H5T_STD_I32LE;
ATTRIBUTE "silo" {
DATATYPE H5T_COMPOUND {
H5T_STRING {
STRSIZE 5;
STRPAD H5T_STR_NULLTERM;
CSET H5T_CSET_ASCII;
CTYPE H5T_C_S1;
} "meshid";
H5T_STRING {
STRSIZE 15;
STRPAD H5T_STR_NULLTERM;
CSET H5T_CSET_ASCII;
CTYPE H5T_C_S1;
} "value0";
H5T_STD_I32LE "ndims";
H5T_STD_I32LE "nvals";
H5T_STD_I32LE "nels";
H5T_IEEE_F32LE "time";
H5T_STD_I32LE "use_specmf";
H5T_STD_I32LE "centering";
H5T_ARRAY { [3] H5T_STD_I32LE } "dims";
H5T_ARRAY { [3] H5T_STD_I32LE } "zones";
H5T_ARRAY { [3] H5T_STD_I32LE } "min_index";
H5T_ARRAY { [3] H5T_STD_I32LE } "max_index";
H5T_ARRAY { [3] H5T_IEEE_F32LE } "align";
}
DATASPACE SCALAR
DATA {
(0): {
"mesh",
"/.silo/#000004",
2,
1,
100,
0,
-1000,
111,
[ 10, 10, 0 ],
[ 9, 9, 0 ],
[ 0, 0, 0 ],
[ 9, 9, 0 ],
[ 0.5, 0.5, 0 ]
}
}
}
ATTRIBUTE "silo_type" {
DATATYPE H5T_STD_I32LE
DATASPACE SCALAR
DATA {
(0): 501
}
}
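A dump limited to just that attribute can be produced with h5dump's -a option, for example (assuming the file is the xsn.silo used in the script below):

h5dump -a /sigma_t/silo xsn.silo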
Basically, f['sigma_t'].attrs['silo'] returns a tuple with all of the data, correctly formatted, but without any of the associated datatype labels. (I need to know the names meshid, value0, and so on.) Is there any way around this? I'm at a loss.
The HDF5 file contains a "sigma_t" field, and the actual data is stored in /.silo/#000004.
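(The raw values themselves can be read directly from that path; a minimal sketch, assuming /.silo/#000004 is an ordinary dataset in the same file:)

import h5py

# Minimal sketch: read the raw "sigma_t" values via the path from the
# "value0" field above (assumes it is a plain dataset in the same file).
with h5py.File('xsn.silo', 'r') as f:
    sigma_t_values = f['/.silo/#000004'][...]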
Script:
import h5py
f = h5py.File('xsn.silo', 'r')
print(f['sigma_t'].attrs['silo'])
Result:
('mesh', '/.silo/#000004', 2, 1, 100, 0.0, -1000, 111, array([10, 10, 0], dtype=int32), array([9, 9, 0], dtype=int32), array([0, 0, 0], dtype=int32), array([9, 9, 0], dtype=int32), array([ 0.5, 0.5, 0. ], dtype=float32))
What I would also like to get is:
('meshid','value0','ndims', ..., 'align')
Is this possible?
Answer 0 (score: 3)
I got an answer from the developers via the h5py Google groups page: this is a bug that will be fixed in h5py 1.4.
What I ended up doing was:
import h5py

f = h5py.File('xsn.silo', 'r')
group = f['sigma_t']
# Open the attribute through the low-level API to get at its compound dtype,
# whose field names are what the high-level interface drops.
attr_id = h5py.h5a.open(group.id, 'silo')
# Pair the field names with the values returned by the high-level interface.
data = dict(zip(attr_id.dtype.names, group.attrs['silo']))
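With the values shown in the h5dump output above, data then maps each field name to its value (the compound dtype preserves field order), so lookups like these should work:

print(data['meshid'])     # 'mesh'
print(data['value0'])     # '/.silo/#000004'
print(data['centering'])  # 111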
Answer 1 (score: 0)
Thanks for the answer, Seth! Your answer helped me, but this might make it a little easier:
# path of the table that you want to look at
group = f[path]
# checking the attributes turns up names like FIELD_0_NAME or TITLE
for attribute in group.attrs:
    # I only want the ones that end with NAME
    if attribute.endswith('NAME'):
        # then I take the actual name (e.g. TrialTime) instead of FIELD_0_NAME
        print(group.attrs[attribute])
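If you want the names in a list rather than printed, the same filter can be collected with a comprehension (the order follows attribute iteration, not necessarily the field order):

names = [group.attrs[attribute] for attribute in group.attrs
         if attribute.endswith('NAME')]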