I have a retrained tiny YOLOv3 model that has been converted to an OpenVINO-compatible IR model. I run inference with it on a laptop with an Intel i7-8750 and an NCS2, and on a Raspberry Pi 3 with an NCS2, in both cases through the C++ API. However, the byte sizes of the output nodes do not match between the two platforms.
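For reference, the IR was produced with the Model Optimizer along these general lines (a sketch, not my exact invocation; the .pb file name and the path to the yolo_v3_tiny.json config are placeholders):

python3 mo_tf.py \
    --input_model frozen_tiny_yolov3.pb \
    --tensorflow_use_custom_operations_config yolo_v3_tiny.json \
    --input_shape [1,416,416,3] \
    --data_type FP16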
I am using the snippet below to read the output node blobs and their sizes so that I can create the output buffers.
// getOutputsInfo() returns an OutputsDataMap (the original snippet
// declared it as InputsDataMap, which does not compile).
InferenceEngine::OutputsDataMap output_info(network.getOutputsInfo());
for (auto &item : output_info) {
    auto output_name = item.first;
    std::cout << "Output name :" << output_name << std::endl;
    // Ask the inference request for the blob backing this output node.
    auto output = async_infer_request.GetBlob(output_name);
    std::cout << "output buffer size : " << output->byteSize() << std::endl;
    // output_buffer (float *, declared elsewhere) is used later to read the detections.
    output_buffer = output->buffer().as<InferenceEngine::PrecisionTrait<InferenceEngine::Precision::FP32>::value_type *>();
}
Here, network holds the model that was loaded into the MYRIAD plugin.
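For completeness, the loading follows the 2019 R1 sample pattern, roughly like this (a minimal sketch; the IR file names are placeholders):

// Read the IR and force batch size 1.
InferenceEngine::CNNNetReader reader;
reader.ReadNetwork("tiny_yolov3.xml");   // placeholder file names
reader.ReadWeights("tiny_yolov3.bin");
auto network = reader.getNetwork();
network.setBatchSize(1);

// Resolve the MYRIAD plugin for the NCS2, load the network onto it,
// and create the inference request used in the snippet above.
InferenceEngine::InferencePlugin plugin =
    InferenceEngine::PluginDispatcher().getPluginByDevice("MYRIAD");
auto executable_network = plugin.LoadNetwork(network, {});
auto async_infer_request = executable_network.CreateInferRequest();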
On the laptop with the NCS2, I get the following:
[ INFO ] InferenceEngine:
API version ............ 1.6
Build ..................custom_releases/2019/R1_c9b66a26e4d65bb986bb740e73f58c6e9e84c7c2
[ INFO ] Loading plugin
API version ............ 1.6
Build .................. 22443
Description ....... myriadPlugin
[ INFO ] Loading network files
[ INFO ] Batch size is forced to 1.
[ INFO ] Successfully loaded network files
inputs
inputDims=416 416 3 1
detector/yolo-v3-tiny/Conv_12/BiasAdd/YoloRegion
outputDims=1 18 26 26
detector/yolo-v3-tiny/Conv_9/BiasAdd/YoloRegion
outputDims=1 18 13 13
Output data size : 3042
[ INFO ] Loading model to the plugin
[ INFO ] Loaded model to the plugin
[ INFO ] Creating an inference request from the network
[ INFO ] Created an inference request from the network
Input name :inputs
Input buffer size : 519168
Output name :detector/yolo-v3-tiny/Conv_12/BiasAdd/YoloRegion
output buffer size : 48672
Output name :detector/yolo-v3-tiny/Conv_9/BiasAdd/YoloRegion
output buffer size : 12168
However, on the Raspberry Pi 3, I get the following:
[ INFO ] InferenceEngine:
API version ............ 1.6
Build .................. 22443
[ INFO ] Loading plugin
API version ............ 1.6
Build .................. 22443
Description ....... myriadPlugin
[ INFO ] Loading network files
[ INFO ] Batch size is forced to 1.
[ INFO ] Successfully loaded network files
inputs
inputDims=416 416 3 1
detector/yolo-v3-tiny/Conv_12/BiasAdd/YoloRegion
outputDims=1 18 26 26
detector/yolo-v3-tiny/Conv_9/BiasAdd/YoloRegion
outputDims=1 18 13 13
Output data size : 3042
[ INFO ] Loading model to the plugin
[ INFO ] Loaded model to the plugin
[ INFO ] Creating an inference request from the network
[ INFO ] Created an inference request from the network
Input name :inputs
Input buffer size : 519168
Output name :detector/yolo-v3-tiny/Conv_12/BiasAdd/YoloRegion
terminate called after throwing an instance of 'InferenceEngine::details::InferenceEngineException'
what(): The output blob size is not equal to the network output size: got 12168 expecting 11492 /opt/intel/openvino/deployment_tools/inference_engine/include/details/ie_exception_conversion.hpp:71
Aborted
Doing the basic math, the byte size of the output blob on my laptop matches the node's dimensions: 1x18x13x13x4 = 12168 bytes. On the Raspberry Pi 3, however, the network expects a blob of 1x17x13x13x4 = 11492 bytes.
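The 18 channels are consistent with the tiny YOLOv3 head, 3 * (5 + num_classes) with one class, while 17 is not even divisible by 3, so the Pi-side expectation cannot come from that formula. The arithmetic itself checks out:

#include <cstddef>
#include <iostream>

int main() {
    // Bytes of an FP32 blob of shape 1 x C x 13 x 13.
    auto bytes = [](std::size_t c) { return 1 * c * 13 * 13 * 4; };
    std::cout << bytes(18) << std::endl;  // 12168 -- what the laptop reports
    std::cout << bytes(17) << std::endl;  // 11492 -- what the Pi expects
}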
However, if I run inference with the same model on the same Raspberry Pi 3 + NCS2 combination using the Python API, it works fine. Is there something about the C++ API on the Raspberry Pi 3 that I am missing?
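For what it's worth, a minimal check like the one below (same API calls as above) would show whether the 17-channel expectation comes from the network description itself or only from the blob the request allocates:

InferenceEngine::OutputsDataMap outputs(network.getOutputsInfo());
for (auto &item : outputs) {
    // Size implied by the dims the network reports for this output.
    const auto &dims = item.second->getTensorDesc().getDims();
    std::size_t expected = sizeof(float);
    for (auto d : dims) expected *= d;
    auto blob = async_infer_request.GetBlob(item.first);
    std::cout << item.first << ": network says " << expected
              << " bytes, blob holds " << blob->byteSize() << " bytes" << std::endl;
}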