BatchNorm and reshuffling the training images after each epoch

Posted: 2016-07-25 19:27:24

Tags: caffe conv-neural-network

The recommended practice when using BatchNorm is to reshuffle the training image set between epochs, so that a given image does not land in a mini-batch with the same companion images on every pass.

How do you achieve this with Caffe?

1 Answer:

Answer 0 (score: 0)

If you are using an ImageData layer as input, set shuffle to true.

For example, if you have:

layer {
  name: "data"
  type: "ImageData"
  top: "data"
  top: "label"
  transform_param {
    mirror: false
    crop_size: 227
    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"
  }
  image_data_param {
    source: "examples/_temp/file_list.txt"
    batch_size: 50
    new_height: 256
    new_width: 256
  }
}

simply add shuffle: true to the image_data_param block:

layer {
  name: "data"
  type: "ImageData"
  top: "data"
  top: "label"
  transform_param {
    mirror: false
    crop_size: 227
    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"
  }
  image_data_param {
    source: "examples/_temp/file_list.txt"
    batch_size: 50
    new_height: 256
    new_width: 256
    shuffle: true
  }
}

For the documentation, see:

You can also find the source code here:

Of particular interest is the code in the function load_batch, which reshuffles the data at the end of each epoch:

lines_id_++;
if (lines_id_ >= lines_size) {
  // We have reached the end. Restart from the first.
  DLOG(INFO) << "Restarting data prefetching from start.";
  lines_id_ = 0;
  if (this->layer_param_.image_data_param().shuffle()) {
    ShuffleImages();
  }
}
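The logic above can be illustrated outside of Caffe as well. Below is a minimal Python sketch of the same cursor pattern: advance an index through the list of image lines, and when it wraps past the end (i.e., one epoch has completed), reshuffle before starting over. The function name iterate_epochs and its parameters are hypothetical, chosen only for this illustration; they are not part of Caffe's API.

```python
import random

def iterate_epochs(lines, num_steps, shuffle=True, seed=0):
    """Mimic the ImageDataLayer cursor: step through the file list and
    reshuffle whenever the cursor wraps around (end of an epoch)."""
    rng = random.Random(seed)
    lines = list(lines)
    if shuffle:
        rng.shuffle(lines)          # initial shuffle, like ShuffleImages() at setup
    lines_id = 0
    visited = []
    for _ in range(num_steps):
        visited.append(lines[lines_id])
        lines_id += 1
        if lines_id >= len(lines):  # we have reached the end: restart from the first
            lines_id = 0
            if shuffle:
                rng.shuffle(lines)  # reshuffle so next epoch's batches differ

    return visited

# Two epochs over four items: each epoch visits every item exactly once,
# but generally in a different order.
seq = iterate_epochs(["a", "b", "c", "d"], num_steps=8)
```

Because the reshuffle happens only at the epoch boundary, every image is still seen exactly once per epoch; only the mini-batch composition changes between epochs, which is exactly the property the question asks for with BatchNorm.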