Caffe not loading trained weights; outputs the same class every time

Date: 2017-11-13 08:59:46

Tags: python-3.5 caffe pycaffe

I am trying to recreate the visual sentiment prediction model from here, but I cannot get correct results: net_full_conv.forward_all(data=np.asarray([transformer.preprocess('data', image)])) returns the same result for every input image, namely [[0.5, 0.5]] (which I suspect comes from the initial, untrained weights).

Below is the code I wrote.

import caffe
import numpy as np

mean_file = 'ilsvrc_2012_mean.npy'
deploy_path = 'sentiment.prototxt'
model_path = 'sentiment.caffemodel'

# Load the deploy net together with the trained weights in TEST phase.
net_full_conv = caffe.Net(deploy_path, model_path, caffe.TEST)

# Preprocess inputs the way the AlexNet-style model expects:
# subtract the per-channel ImageNet mean, move channels first (HxWxC -> CxHxW),
# swap RGB -> BGR, and rescale [0, 1] pixel values to [0, 255].
transformer = caffe.io.Transformer({'data': net_full_conv.blobs['data'].data.shape})
transformer.set_mean('data', np.load(mean_file).mean(1).mean(1))
transformer.set_transpose('data', (2, 0, 1))
transformer.set_channel_swap('data', (2, 1, 0))
transformer.set_raw_scale('data', 255.0)

image = caffe.io.load_image('test.jpg')  # loaded as RGB floats in [0, 1]
out = net_full_conv.forward_all(data=np.asarray([transformer.preprocess('data', image)]))
print(out['prob'])
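
One way to check whether the weights were actually copied from the caffemodel is the minimal sketch below (not part of the original script; it reuses the net_full_conv object from above). Layers in a deploy net that receive no weights keep Caffe's default zero initialization, and an all-zero fc8_twitter would produce exactly the constant [[0.5, 0.5]] softmax output described above.

# Sketch: print per-layer weight statistics; a layer whose weights were never
# loaded from the caffemodel will show mean = 0 and std = 0.
for name, params in net_full_conv.params.items():
    weights = params[0].data
    print(name, weights.shape, 'mean=%.6f std=%.6f' % (weights.mean(), weights.std()))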
sentiment.prototxt and sentiment.caffemodel were downloaded from the project link; ilsvrc_2012_mean.npy was downloaded from the BVLC/caffe repository.

Note: Python version = 3.5.2, OS: Ubuntu 16.04

Edit: below is the Caffe loading log:

WARNING: Logging before InitGoogleLogging() is written to STDERR
W1113 18:37:25.260143 12638 _caffe.cpp:139] DEPRECATION WARNING - deprecated use of Python interface
W1113 18:37:25.260186 12638 _caffe.cpp:140] Use this instead (with the named "weights" parameter):
W1113 18:37:25.260190 12638 _caffe.cpp:142] Net('sentiment.prototxt', 1, weights='sentiment.caffemodel')
I1113 18:37:25.261277 12638 upgrade_proto.cpp:67] Attempting to upgrade input file specified using deprecated input fields: sentiment.prototxt
I1113 18:37:25.261302 12638 upgrade_proto.cpp:70] Successfully upgraded file specified using deprecated input fields.
W1113 18:37:25.261306 12638 upgrade_proto.cpp:72] Note that future Caffe releases will only support input layers and not input fields.
I1113 18:37:25.516553 12638 net.cpp:51] Initializing net from parameters: 
name: "MVSOCaffeNet_Twitter"
state {
  phase: TEST
  level: 0
}
layer {
  name: "input"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 10
      dim: 3
      dim: 227
      dim: 227
    }
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc8_twitter"
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8_twitter"
  inner_product_param {
    num_output: 2
  }
}
layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc8_twitter"
  top: "prob"
}
I1113 18:37:25.516752 12638 layer_factory.hpp:77] Creating layer input
I1113 18:37:25.516782 12638 net.cpp:84] Creating Layer input
I1113 18:37:25.516785 12638 net.cpp:380] input -> data
I1113 18:37:25.516803 12638 net.cpp:122] Setting up input
I1113 18:37:25.516825 12638 net.cpp:129] Top shape: 10 3 227 227 (1545870)
I1113 18:37:25.516847 12638 net.cpp:137] Memory required for data: 6183480
I1113 18:37:25.516860 12638 layer_factory.hpp:77] Creating layer conv1
I1113 18:37:25.516880 12638 net.cpp:84] Creating Layer conv1
I1113 18:37:25.516885 12638 net.cpp:406] conv1 <- data
I1113 18:37:25.516888 12638 net.cpp:380] conv1 -> conv1
I1113 18:37:25.913112 12638 net.cpp:122] Setting up conv1
I1113 18:37:25.913149 12638 net.cpp:129] Top shape: 10 96 55 55 (2904000)
I1113 18:37:25.913152 12638 net.cpp:137] Memory required for data: 17799480
I1113 18:37:25.913185 12638 layer_factory.hpp:77] Creating layer relu1
I1113 18:37:25.913197 12638 net.cpp:84] Creating Layer relu1
I1113 18:37:25.913218 12638 net.cpp:406] relu1 <- conv1
I1113 18:37:25.913223 12638 net.cpp:367] relu1 -> conv1 (in-place)
I1113 18:37:25.913389 12638 net.cpp:122] Setting up relu1
I1113 18:37:25.913395 12638 net.cpp:129] Top shape: 10 96 55 55 (2904000)
I1113 18:37:25.913398 12638 net.cpp:137] Memory required for data: 29415480
I1113 18:37:25.913419 12638 layer_factory.hpp:77] Creating layer pool1
I1113 18:37:25.913424 12638 net.cpp:84] Creating Layer pool1
I1113 18:37:25.913427 12638 net.cpp:406] pool1 <- conv1
I1113 18:37:25.913431 12638 net.cpp:380] pool1 -> pool1
I1113 18:37:25.913439 12638 net.cpp:122] Setting up pool1
I1113 18:37:25.913444 12638 net.cpp:129] Top shape: 10 96 27 27 (699840)
I1113 18:37:25.913446 12638 net.cpp:137] Memory required for data: 32214840
I1113 18:37:25.913450 12638 layer_factory.hpp:77] Creating layer norm1
I1113 18:37:25.913455 12638 net.cpp:84] Creating Layer norm1
I1113 18:37:25.913457 12638 net.cpp:406] norm1 <- pool1
I1113 18:37:25.913461 12638 net.cpp:380] norm1 -> norm1
I1113 18:37:25.913610 12638 net.cpp:122] Setting up norm1
I1113 18:37:25.913617 12638 net.cpp:129] Top shape: 10 96 27 27 (699840)
I1113 18:37:25.913620 12638 net.cpp:137] Memory required for data: 35014200
I1113 18:37:25.913642 12638 layer_factory.hpp:77] Creating layer conv2
I1113 18:37:25.913650 12638 net.cpp:84] Creating Layer conv2
I1113 18:37:25.913652 12638 net.cpp:406] conv2 <- norm1
I1113 18:37:25.913656 12638 net.cpp:380] conv2 -> conv2
I1113 18:37:25.915510 12638 net.cpp:122] Setting up conv2
I1113 18:37:25.915525 12638 net.cpp:129] Top shape: 10 256 27 27 (1866240)
I1113 18:37:25.915547 12638 net.cpp:137] Memory required for data: 42479160
I1113 18:37:25.915575 12638 layer_factory.hpp:77] Creating layer relu2
I1113 18:37:25.915582 12638 net.cpp:84] Creating Layer relu2
I1113 18:37:25.915585 12638 net.cpp:406] relu2 <- conv2
I1113 18:37:25.915590 12638 net.cpp:367] relu2 -> conv2 (in-place)
I1113 18:37:25.915998 12638 net.cpp:122] Setting up relu2
I1113 18:37:25.916007 12638 net.cpp:129] Top shape: 10 256 27 27 (1866240)
I1113 18:37:25.916028 12638 net.cpp:137] Memory required for data: 49944120
I1113 18:37:25.916030 12638 layer_factory.hpp:77] Creating layer pool2
I1113 18:37:25.916055 12638 net.cpp:84] Creating Layer pool2
I1113 18:37:25.916059 12638 net.cpp:406] pool2 <- conv2
I1113 18:37:25.916064 12638 net.cpp:380] pool2 -> pool2
I1113 18:37:25.916083 12638 net.cpp:122] Setting up pool2
I1113 18:37:25.916087 12638 net.cpp:129] Top shape: 10 256 13 13 (432640)
I1113 18:37:25.916090 12638 net.cpp:137] Memory required for data: 51674680
I1113 18:37:25.916111 12638 layer_factory.hpp:77] Creating layer norm2
I1113 18:37:25.916119 12638 net.cpp:84] Creating Layer norm2
I1113 18:37:25.916121 12638 net.cpp:406] norm2 <- pool2
I1113 18:37:25.916126 12638 net.cpp:380] norm2 -> norm2
I1113 18:37:25.916296 12638 net.cpp:122] Setting up norm2
I1113 18:37:25.916303 12638 net.cpp:129] Top shape: 10 256 13 13 (432640)
I1113 18:37:25.916306 12638 net.cpp:137] Memory required for data: 53405240
I1113 18:37:25.916327 12638 layer_factory.hpp:77] Creating layer conv3
I1113 18:37:25.916333 12638 net.cpp:84] Creating Layer conv3
I1113 18:37:25.916337 12638 net.cpp:406] conv3 <- norm2
I1113 18:37:25.916340 12638 net.cpp:380] conv3 -> conv3
I1113 18:37:25.918207 12638 net.cpp:122] Setting up conv3
I1113 18:37:25.918225 12638 net.cpp:129] Top shape: 10 384 13 13 (648960)
I1113 18:37:25.918227 12638 net.cpp:137] Memory required for data: 56001080
I1113 18:37:25.918267 12638 layer_factory.hpp:77] Creating layer relu3
I1113 18:37:25.918285 12638 net.cpp:84] Creating Layer relu3
I1113 18:37:25.918289 12638 net.cpp:406] relu3 <- conv3
I1113 18:37:25.918294 12638 net.cpp:367] relu3 -> conv3 (in-place)
I1113 18:37:25.918445 12638 net.cpp:122] Setting up relu3
I1113 18:37:25.918452 12638 net.cpp:129] Top shape: 10 384 13 13 (648960)
I1113 18:37:25.918473 12638 net.cpp:137] Memory required for data: 58596920
I1113 18:37:25.918475 12638 layer_factory.hpp:77] Creating layer conv4
I1113 18:37:25.918500 12638 net.cpp:84] Creating Layer conv4
I1113 18:37:25.918503 12638 net.cpp:406] conv4 <- conv3
I1113 18:37:25.918509 12638 net.cpp:380] conv4 -> conv4
I1113 18:37:25.921069 12638 net.cpp:122] Setting up conv4
I1113 18:37:25.921084 12638 net.cpp:129] Top shape: 10 384 13 13 (648960)
I1113 18:37:25.921087 12638 net.cpp:137] Memory required for data: 61192760
I1113 18:37:25.921113 12638 layer_factory.hpp:77] Creating layer relu4
I1113 18:37:25.921120 12638 net.cpp:84] Creating Layer relu4
I1113 18:37:25.921123 12638 net.cpp:406] relu4 <- conv4
I1113 18:37:25.921128 12638 net.cpp:367] relu4 -> conv4 (in-place)
I1113 18:37:25.921468 12638 net.cpp:122] Setting up relu4
I1113 18:37:25.921476 12638 net.cpp:129] Top shape: 10 384 13 13 (648960)
I1113 18:37:25.921479 12638 net.cpp:137] Memory required for data: 63788600
I1113 18:37:25.921501 12638 layer_factory.hpp:77] Creating layer conv5
I1113 18:37:25.921509 12638 net.cpp:84] Creating Layer conv5
I1113 18:37:25.921530 12638 net.cpp:406] conv5 <- conv4
I1113 18:37:25.921535 12638 net.cpp:380] conv5 -> conv5
I1113 18:37:25.923732 12638 net.cpp:122] Setting up conv5
I1113 18:37:25.923758 12638 net.cpp:129] Top shape: 10 256 13 13 (432640)
I1113 18:37:25.923761 12638 net.cpp:137] Memory required for data: 65519160
I1113 18:37:25.923791 12638 layer_factory.hpp:77] Creating layer relu5
I1113 18:37:25.923799 12638 net.cpp:84] Creating Layer relu5
I1113 18:37:25.923802 12638 net.cpp:406] relu5 <- conv5
I1113 18:37:25.923807 12638 net.cpp:367] relu5 -> conv5 (in-place)
I1113 18:37:25.923979 12638 net.cpp:122] Setting up relu5
I1113 18:37:25.923987 12638 net.cpp:129] Top shape: 10 256 13 13 (432640)
I1113 18:37:25.923990 12638 net.cpp:137] Memory required for data: 67249720
I1113 18:37:25.924011 12638 layer_factory.hpp:77] Creating layer pool5
I1113 18:37:25.924016 12638 net.cpp:84] Creating Layer pool5
I1113 18:37:25.924019 12638 net.cpp:406] pool5 <- conv5
I1113 18:37:25.924023 12638 net.cpp:380] pool5 -> pool5
I1113 18:37:25.924031 12638 net.cpp:122] Setting up pool5
I1113 18:37:25.924036 12638 net.cpp:129] Top shape: 10 256 6 6 (92160)
I1113 18:37:25.924037 12638 net.cpp:137] Memory required for data: 67618360
I1113 18:37:25.924041 12638 layer_factory.hpp:77] Creating layer fc6
I1113 18:37:25.924047 12638 net.cpp:84] Creating Layer fc6
I1113 18:37:25.924051 12638 net.cpp:406] fc6 <- pool5
I1113 18:37:25.924054 12638 net.cpp:380] fc6 -> fc6
I1113 18:37:25.960934 12638 net.cpp:122] Setting up fc6
I1113 18:37:25.960975 12638 net.cpp:129] Top shape: 10 4096 (40960)
I1113 18:37:25.960983 12638 net.cpp:137] Memory required for data: 67782200
I1113 18:37:25.961019 12638 layer_factory.hpp:77] Creating layer relu6
I1113 18:37:25.961030 12638 net.cpp:84] Creating Layer relu6
I1113 18:37:25.961050 12638 net.cpp:406] relu6 <- fc6
I1113 18:37:25.961069 12638 net.cpp:367] relu6 -> fc6 (in-place)
I1113 18:37:25.961421 12638 net.cpp:122] Setting up relu6
I1113 18:37:25.961432 12638 net.cpp:129] Top shape: 10 4096 (40960)
I1113 18:37:25.961436 12638 net.cpp:137] Memory required for data: 67946040
I1113 18:37:25.961457 12638 layer_factory.hpp:77] Creating layer drop6
I1113 18:37:25.961484 12638 net.cpp:84] Creating Layer drop6
I1113 18:37:25.961491 12638 net.cpp:406] drop6 <- fc6
I1113 18:37:25.961498 12638 net.cpp:367] drop6 -> fc6 (in-place)
I1113 18:37:25.961520 12638 net.cpp:122] Setting up drop6
I1113 18:37:25.961545 12638 net.cpp:129] Top shape: 10 4096 (40960)
I1113 18:37:25.961549 12638 net.cpp:137] Memory required for data: 68109880
I1113 18:37:25.961566 12638 layer_factory.hpp:77] Creating layer fc7
I1113 18:37:25.961572 12638 net.cpp:84] Creating Layer fc7
I1113 18:37:25.961594 12638 net.cpp:406] fc7 <- fc6
I1113 18:37:25.961601 12638 net.cpp:380] fc7 -> fc7
I1113 18:37:25.980587 12638 net.cpp:122] Setting up fc7
I1113 18:37:25.980612 12638 net.cpp:129] Top shape: 10 4096 (40960)
I1113 18:37:25.980617 12638 net.cpp:137] Memory required for data: 68273720
I1113 18:37:25.980628 12638 layer_factory.hpp:77] Creating layer relu7
I1113 18:37:25.980636 12638 net.cpp:84] Creating Layer relu7
I1113 18:37:25.980641 12638 net.cpp:406] relu7 <- fc7
I1113 18:37:25.980669 12638 net.cpp:367] relu7 -> fc7 (in-place)
I1113 18:37:25.981019 12638 net.cpp:122] Setting up relu7
I1113 18:37:25.981030 12638 net.cpp:129] Top shape: 10 4096 (40960)
I1113 18:37:25.981034 12638 net.cpp:137] Memory required for data: 68437560
I1113 18:37:25.981039 12638 layer_factory.hpp:77] Creating layer drop7
I1113 18:37:25.981046 12638 net.cpp:84] Creating Layer drop7
I1113 18:37:25.981050 12638 net.cpp:406] drop7 <- fc7
I1113 18:37:25.981055 12638 net.cpp:367] drop7 -> fc7 (in-place)
I1113 18:37:25.981079 12638 net.cpp:122] Setting up drop7
I1113 18:37:25.981084 12638 net.cpp:129] Top shape: 10 4096 (40960)
I1113 18:37:25.981087 12638 net.cpp:137] Memory required for data: 68601400
I1113 18:37:25.981092 12638 layer_factory.hpp:77] Creating layer fc8_twitter
I1113 18:37:25.981098 12638 net.cpp:84] Creating Layer fc8_twitter
I1113 18:37:25.981102 12638 net.cpp:406] fc8_twitter <- fc7
I1113 18:37:25.981108 12638 net.cpp:380] fc8_twitter -> fc8_twitter
I1113 18:37:25.981154 12638 net.cpp:122] Setting up fc8_twitter
I1113 18:37:25.981160 12638 net.cpp:129] Top shape: 10 2 (20)
I1113 18:37:25.981164 12638 net.cpp:137] Memory required for data: 68601480
I1113 18:37:25.981171 12638 layer_factory.hpp:77] Creating layer prob
I1113 18:37:25.981178 12638 net.cpp:84] Creating Layer prob
I1113 18:37:25.981182 12638 net.cpp:406] prob <- fc8_twitter
I1113 18:37:25.981201 12638 net.cpp:380] prob -> prob
I1113 18:37:25.981796 12638 net.cpp:122] Setting up prob
I1113 18:37:25.981806 12638 net.cpp:129] Top shape: 10 2 (20)
I1113 18:37:25.981830 12638 net.cpp:137] Memory required for data: 68601560
I1113 18:37:25.981837 12638 net.cpp:200] prob does not need backward computation.
I1113 18:37:25.981842 12638 net.cpp:200] fc8_twitter does not need backward computation.
I1113 18:37:25.981847 12638 net.cpp:200] drop7 does not need backward computation.
I1113 18:37:25.981851 12638 net.cpp:200] relu7 does not need backward computation.
I1113 18:37:25.981855 12638 net.cpp:200] fc7 does not need backward computation.
I1113 18:37:25.981859 12638 net.cpp:200] drop6 does not need backward computation.
I1113 18:37:25.981864 12638 net.cpp:200] relu6 does not need backward computation.
I1113 18:37:25.981868 12638 net.cpp:200] fc6 does not need backward computation.
I1113 18:37:25.981873 12638 net.cpp:200] pool5 does not need backward computation.
I1113 18:37:25.981878 12638 net.cpp:200] relu5 does not need backward computation.
I1113 18:37:25.981884 12638 net.cpp:200] conv5 does not need backward computation.
I1113 18:37:25.981887 12638 net.cpp:200] relu4 does not need backward computation.
I1113 18:37:25.981892 12638 net.cpp:200] conv4 does not need backward computation.
I1113 18:37:25.981897 12638 net.cpp:200] relu3 does not need backward computation.
I1113 18:37:25.981902 12638 net.cpp:200] conv3 does not need backward computation.
I1113 18:37:25.981907 12638 net.cpp:200] norm2 does not need backward computation.
I1113 18:37:25.981912 12638 net.cpp:200] pool2 does not need backward computation.
I1113 18:37:25.981917 12638 net.cpp:200] relu2 does not need backward computation.
I1113 18:37:25.981922 12638 net.cpp:200] conv2 does not need backward computation.
I1113 18:37:25.981927 12638 net.cpp:200] norm1 does not need backward computation.
I1113 18:37:25.981932 12638 net.cpp:200] pool1 does not need backward computation.
I1113 18:37:25.981937 12638 net.cpp:200] relu1 does not need backward computation.
I1113 18:37:25.981942 12638 net.cpp:200] conv1 does not need backward computation.
I1113 18:37:25.981947 12638 net.cpp:200] input does not need backward computation.
I1113 18:37:25.981951 12638 net.cpp:242] This network produces output prob
I1113 18:37:25.981966 12638 net.cpp:255] Network initialization done.
I1113 18:37:26.067855 12638 upgrade_proto.cpp:67] Attempting to upgrade input file specified using deprecated input fields: sentiment.caffemodel
I1113 18:37:26.067874 12638 upgrade_proto.cpp:70] Successfully upgraded file specified using deprecated input fields.
W1113 18:37:26.067878 12638 upgrade_proto.cpp:72] Note that future Caffe releases will only support input layers and not input fields.
I1113 18:37:26.069875 12638 net.cpp:744] Ignoring source layer fc6-conv
I1113 18:37:26.069881 12638 net.cpp:744] Ignoring source layer fc7-conv
I1113 18:37:26.069885 12638 net.cpp:744] Ignoring source layer fc8_twitter-conv
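
The last three log lines show Caffe skipping the layers fc6-conv, fc7-conv, and fc8_twitter-conv stored in the weights file. As a further diagnostic, here is a minimal sketch (assuming the protobuf bindings shipped with pycaffe as caffe.proto) that lists the layer names stored inside sentiment.caffemodel, so they can be compared with the names in sentiment.prototxt:

from caffe.proto import caffe_pb2

net_param = caffe_pb2.NetParameter()
with open('sentiment.caffemodel', 'rb') as f:
    net_param.ParseFromString(f.read())
# Older models store layers in the deprecated V1 'layers' field rather than 'layer'.
print([l.name for l in net_param.layer] + [l.name for l in net_param.layers])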

0 Answers:

No answers.