caffe hdf5 H5LTfind_dataset(file_id, dataset_name_) cannot find the HDF5 dataset

Date: 2016-10-15 12:32:53

Tags: machine-learning deep-learning caffe hdf5

I use HDF5 as one of the inputs to my caffe net. The hdf5 file only contains some weight information to be fed into a SigmoidCrossEntropyLoss layer, so it does not contain any labels. This error occurs:

  I1015 07:08:54.605777 17909 net.cpp:100] Creating Layer weight28
  I1015 07:08:54.605797 17909 net.cpp:408] weight28 -> weight28
  I1015 07:08:54.605834 17909 hdf5_data_layer.cpp:79] Loading list of HDF5 filenames from: /home/zhangyu/codes/unsupervised/data/weight28.txt
  I1015 07:08:54.605926 17909 hdf5_data_layer.cpp:93] Number of HDF5 files: 1
  F1015 07:08:54.608682 17909 hdf5.cpp:14] Check failed: H5LTfind_dataset(file_id, dataset_name_) Failed to find HDF5 dataset weight28
  *** Check failure stack trace: ***
      @     0x7f17077ec9fd  google::LogMessage::Fail()
      @     0x7f17077ee89d  google::LogMessage::SendToLog()
      @     0x7f17077ec5ec  google::LogMessage::Flush()
      @     0x7f17077ef1be  google::LogMessageFatal::~LogMessageFatal()
      @     0x7f1707e4d774  caffe::hdf5_load_nd_dataset_helper<>()
      @     0x7f1707e4bcf0  caffe::hdf5_load_nd_dataset<>()
      @     0x7f1707e8fd78  caffe::HDF5DataLayer<>::LoadHDF5FileData()
      @     0x7f1707e8ebf8  caffe::HDF5DataLayer<>::LayerSetUp()
      @     0x7f1707e283b2  caffe::Net<>::Init()
      @     0x7f1707e2ad85  caffe::Net<>::Net()
      @     0x7f1707e6da5f  caffe::Solver<>::InitTrainNet()
      @     0x7f1707e6df7b  caffe::Solver<>::Init()
      @     0x7f1707e6e3e8  caffe::Solver<>::Solver()
      @     0x7f1707e865a3  caffe::Creator_SGDSolver<>()
      @           0x4116b1  caffe::SolverRegistry<>::CreateSolver()
      @           0x40ac56  train()
      @           0x406e32  main
      @     0x7f17066adf45  (unknown)
      @           0x4074b6  (unknown)

I searched this problem, and it seems that my hdf5 file needs a dataset named label, but in fact I don't need one. I just need a dataset of size 91250x28x28 to feed into the loss layer as weights. Here is my h5 file:


HDF5 weight28.h5 
Group '/' 
    Dataset 'data' 
        Size:  2555000x28
        MaxSize:  Infx28
        Datatype:   H5T_IEEE_F64LE (double)
        ChunkSize:  28x28
        Filters:  none
        FillValue:  0.000000
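
As a sanity check, the failing check can be reproduced outside of caffe. The following sketch (assuming weight28.h5 is in the working directory) calls the same HDF5 Lite function that fails in caffe's hdf5.cpp:

  #include <hdf5.h>
  #include <hdf5_hl.h>
  #include <cstdio>

  int main() {
    hid_t file_id = H5Fopen("weight28.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
    // The same check that fails at hdf5.cpp:14; prints 1 if found, 0 if not.
    printf("weight28: %d\n", (int)H5LTfind_dataset(file_id, "weight28"));
    printf("data:     %d\n", (int)H5LTfind_dataset(file_id, "data"));
    H5Fclose(file_id);
    return 0;
  }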

The prototxt file:

  layer {
    name: "loss_G"
    type: "SigmoidCrossEntropyLoss"
    bottom: "global_smR"
    bottom: "mask28"
    bottom: "weight28" // Added it here
    top: "loss_G"
  }
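
For context, the crash log above implies an "HDF5Data" layer along these lines produces the "weight28" top. This is a hypothetical reconstruction (the batch_size value is a guess), since the question does not show it:

  layer {
    name: "weight28"
    type: "HDF5Data"
    top: "weight28"
    hdf5_data_param {
      source: "/home/zhangyu/codes/unsupervised/data/weight28.txt"
      batch_size: 64
    }
  }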

I modified the SigmoidCrossEntropyLoss layer, adding the weights as a third bottom:

  // modified here: read the per-pixel weights from the new third bottom
  const Dtype* pixelweights = bottom[2]->cpu_data();

  Dtype loss = 0;
  for (int i = 0; i < count; ++i) {
    // weight each pixel's sigmoid cross-entropy term
    loss -= pixelweights[i] * (input_data[i] * (target[i] - (input_data[i] >= 0)) -
            log(1 + exp(input_data[i] - 2 * input_data[i] * (input_data[i] >= 0))));
  }
  top[0]->mutable_cpu_data()[0] = loss / num;
}
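
Only the forward pass is shown above. If the loss is weighted in Forward_cpu, the gradient computed in Backward_cpu must be weighted the same way, and the layer must also be allowed to take three bottoms (the stock LossLayer expects exactly two, so ExactNumBottomBlobs() needs overriding). Below is a minimal sketch of the matching backward pass, not the author's code, assuming the stock sigmoid_output_ member and the same third bottom:

  template <typename Dtype>
  void SigmoidCrossEntropyLossLayer<Dtype>::Backward_cpu(
      const vector<Blob<Dtype>*>& top, const vector<bool>& propagate_down,
      const vector<Blob<Dtype>*>& bottom) {
    if (propagate_down[0]) {
      const int count = bottom[0]->count();
      const int num = bottom[0]->num();
      const Dtype* sigmoid_output_data = sigmoid_output_->cpu_data();
      const Dtype* target = bottom[1]->cpu_data();
      const Dtype* pixelweights = bottom[2]->cpu_data();  // the added bottom
      Dtype* bottom_diff = bottom[0]->mutable_cpu_diff();
      const Dtype loss_weight = top[0]->cpu_diff()[0];
      for (int i = 0; i < count; ++i) {
        // d(weighted loss)/d(input_i) = w_i * (sigmoid(x_i) - target_i)
        bottom_diff[i] = loss_weight * pixelweights[i] *
            (sigmoid_output_data[i] - target[i]) / num;
      }
    }
  }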

I expect a batch of data from the h5 file to be read into the net (of size batch_size * 28 * 28). Here are my questions:

  • Can I get what I expect from the code above?
  • Do I have to add a label dataset to the h5 file to fix the error?
  • If I add one to the h5 file, how do I deal with the extra label data in the loss layer?

Any suggestion will be appreciated, thanks!

1 Answer:

Answer 0 (score: 1)

Your "HDF5Data" has top named "weight28", but your h5 file has only dataset "data". The "top" of "HDF5Data" layer must be the same as the Dataset name stored in the h5 file. If you have more than one dataset dtored in the same file, you can have multiple tops with the names of the Datasets in the h5 file.