Caffe test net with images as labels

Date: 2017-12-04 08:53:22

Tags: python neural-network computer-vision deep-learning caffe

Question

I am trying to create a CNN that uses images as labels, with values between 0 and 1. After some training, my network's loss is around 23. Now I want to inspect the results. For that I use this Python script:

import caffe
import numpy as np
from PIL import Image

net = caffe.Net('D:/caffe/net.prototxt',
            'D:/caffe/net_iter_35000.caffemodel',
            caffe.TEST)

# load input and configure preprocessing
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})

transformer.set_mean('data', np.load('train_mean.npy').mean(1).mean(1))
transformer.set_transpose('data', (2,0,1))
transformer.set_channel_swap('data', (2,1,0))
transformer.set_raw_scale('data', 255.0)

#note we can change the batch size on-the-fly 
#since we classify only one image, we change batch size from 50 to 1
net.blobs['data'].reshape(1,3,360,360)

#load the image in the data layer
im = caffe.io.load_image('train/img0.png')
net.blobs['data'].data[...] = transformer.preprocess('data', im)

#compute
out = net.forward()

result = out['conv7'][0][0]

Now I expected the result values to lie roughly between 0 and 1. Instead, result.max() returns 5.92 and result.min() returns -4315.5.

Is there a mistake in the Python script, or are these values normal for a loss of 23?

Additional information

My train_test.prototxt:

layer {
  name: "mynet"
  type: "Data"
  top: "data0"
  top: "label0"
  include {
    phase: TRAIN
  }
  transform_param {
    mean_file: "train_mean.binaryproto"
    scale: 0.00390625
  }
  data_param {
    source: "train_lmdb"
    batch_size: 32
    backend: LMDB
  }
}

layer {
  name: "mynetlabel"
  type: "Data"
  top: "data1"
  top: "label1"
  include {
    phase: TRAIN
  }
  transform_param {
    scale: 0.00390625
  }
  data_param {
    source: "train_label_lmdb_2"
    batch_size: 32
    backend: LMDB
  }
}

layer {
  name: "mnist"
  type: "Data"
  top: "data0"
  top: "label0"
  include {
    phase: TEST
  }
  transform_param {
    mean_file: "train_mean.binaryproto"
    scale: 0.00390625
  }
  data_param {
    source: "val_lmdb"
    batch_size: 16
    backend: LMDB
  }
}
layer {
  name: "mnistlabel"
  type: "Data"
  top: "data1"
  top: "label1"
  include {
    phase: TEST
  }
  transform_param {
    scale: 0.00390625
  }
  data_param {
    source: "val_label_lmdb_2"
    batch_size: 16
    backend: LMDB
  }
}
.
. 
.
layer {
  name: "conv7"
  type: "Convolution"
  bottom: "conv6"
  top: "conv7"
  param {
    lr_mult: 5.0
    decay_mult: 1.0
  }
  param {
    lr_mult: 10.0
    decay_mult: 0.0
  }
  convolution_param {
    num_output: 1
    pad: 0
    kernel_size: 1
    weight_filler {
      type: "gaussian"
      std: 0.00999999977648
    }
    bias_filler {
      type: "constant"
    }
  }
}

layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "conv7"
  bottom: "data1"
  top: "accuracy"
  include {
    phase: TEST
  }
}

layer {
  name: "loss"
  type: "SigmoidCrossEntropyLoss"
  bottom: "conv7"
  bottom: "data1"
  top: "loss"
}

My net.prototxt:

layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param { shape: { dim: 50 dim: 3 dim: 360 dim: 360 } }
  transform_param {
    scale: 0.00390625
  }
}
.
.
.
layer {
  name: "conv7"
  type: "Convolution"
  bottom: "conv6"
  top: "conv7"
  param {
    lr_mult: 5.0
    decay_mult: 1.0
  }
  param {
    lr_mult: 10.0
    decay_mult: 0.0
  }
  convolution_param {
    num_output: 1
    pad: 0
    kernel_size: 1
    weight_filler {
      type: "gaussian"
      std: 0.00999999977648
    }
    bias_filler {
      type: "constant"
    }
  }
}

1 Answer:

Answer 0 (score: 2)

Your train_test.prototxt uses a "SigmoidCrossEntropyLoss" layer which, as its name suggests, internally combines a "Sigmoid" layer with a cross-entropy loss. Consequently, when deploying the net you should replace this loss layer with a plain "Sigmoid" layer in your net.prototxt file.
See this answer for more information.
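Concretely, the deploy net could end with a Sigmoid layer on top of conv7 (a sketch, assuming the layer names from the net.prototxt above; the top name "prob" is arbitrary):

```
layer {
  name: "prob"
  type: "Sigmoid"
  bottom: "conv7"
  top: "prob"
}
```

The Python script would then read `out['prob'][0][0]` instead of `out['conv7'][0][0]`, and the values would fall in (0, 1) as expected.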

PS,
Caffe does not support an "Accuracy" layer for a single binary output: the "Accuracy" layer assumes the prediction's dimension equals the number of classes (as is suitable for "SoftmaxWithLoss"). In your case you have two labels {0, 1}, but the output's dimension is only 1. See this answer for more information.
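Equivalently, you can apply the logistic function to the raw conv7 outputs outside the net; a minimal NumPy sketch (the sample values are the extremes reported in the question):

```python
import numpy as np

def sigmoid(x):
    # clip to avoid overflow in exp() for extreme raw scores
    x = np.clip(x, -500, 500)
    return 1.0 / (1.0 + np.exp(-x))

# raw conv7 outputs, e.g. the extremes from the question
raw = np.array([5.92, 0.0, -4315.5])
probs = sigmoid(raw)  # all values now lie in (0, 1)
```

This makes it clear why raw values like -4315.5 are not necessarily wrong: after the sigmoid, they simply map to a probability of (almost exactly) 0.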