Running inference on a pretrained ONNX model in C++?

Asked: 2021-07-01 05:52:14

Tags: c++ tensorflow onnx onnxruntime

I am trying to recreate the work shown in the talk "CppDay20 Interoperable AI: ONNX & ONNXRuntime in C++" (M. Arena, M. Verasani). The GitHub repository with the demo code is here. So far I have trained a regression model with TensorFlow and converted it to ONNX for inference in C++. However, the ONNX Runtime session I create cannot use my model's input shape: the shape it reports contains -1.

#include <onnxruntime_cxx_api.h>
#include <array>
#include <iostream>
#include <vector>

Ort::Env env;
Ort::Session session{ env, model_path, Ort::SessionOptions{} };

Ort::AllocatorWithDefaultOptions allocator;
auto* inputName = session.GetInputName(0, allocator);
std::cout << "Input name: " << inputName << "\n";
auto* outputName = session.GetOutputName(0, allocator);
std::cout << "Output name: " << outputName << "\n";
// query the input shape; it may contain -1 for dynamic dimensions
auto inputShape = session.GetInputTypeInfo(0).GetTensorTypeAndShapeInfo().GetShape();
// the model's single input takes 5 feature values
std::vector<float> inputValues = { 1, 2, 3, 4, 5 };

// where to allocate the tensors
auto memoryInfo = Ort::MemoryInfo::CreateCpu(OrtDeviceAllocator, OrtMemTypeCPU);

// create the input tensor (this is not a deep copy!)
auto inputOnnxTensor = Ort::Value::CreateTensor<float>(memoryInfo, 
    inputValues.data(), inputValues.size(), 
    inputShape.data(), inputShape.size());
    
// the API needs the array of inputs you set and the array of outputs you get
std::array<const char*, 1> inputNames = { inputName };
std::array<const char*, 1> outputNames = { outputName };

// finally run the inference!
auto outputValues = session.Run(
    Ort::RunOptions{ nullptr }, // e.g. set a verbosity level only for this run
    inputNames.data(), &inputOnnxTensor, 1, // input to set
    outputNames.data(), 1);                 
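
Once Run succeeds, I plan to read the prediction back roughly like this (a sketch, assuming the model produces a single float output tensor):

// grab a pointer to the output tensor's data and its element count
float* outData = outputValues.front().GetTensorMutableData<float>();
size_t outCount = outputValues.front().GetTensorTypeAndShapeInfo().GetElementCount();

for (size_t i = 0; i < outCount; ++i)
    std::cout << "prediction[" << i << "] = " << outData[i] << "\n";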

Output:

Number of model inputs: 1
Number of model outputs: 1
Input name: input_1
Output name: Identity
tried creating tensor with negative value in shape
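
I suspect the -1 is the dynamic batch dimension that the TensorFlow-to-ONNX conversion leaves in the model, and CreateTensor rejects negative dimensions. A minimal workaround sketch I am considering (assuming the first dimension is the batch and I feed one sample at a time) would overwrite it before building the tensor:

// replace any dynamic (-1) dimension with the concrete size used for this run
for (auto& dim : inputShape)
    if (dim < 0) dim = 1;   // assuming a batch of one sample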

Any suggestions on how to get the inference code working?

0 Answers:

No answers yet.