tensorrt-yolov6
Unable to parse ONNX model file: yolov6s.onnx (ERROR: ModelImporter.cpp:296 In function importModel: [5] Assertion failed: tensors.count(input_name))
Hi,
I could compile the code, but the following is the runtime error:
&&&& RUNNING TensorRT.sample_yolo # D:\Sushil_Projects\DL_Projects\YoloV6\YoloV6_TensorRT\YoloV6_TensorRT\x64\Release\YoloV6_TensorRT.exe
[07/11/2022-14:57:42] [I] Building and running a GPU inference engine for Yolo
[07/11/2022-14:57:42] [I] Parsing ONNX file: yolov6s.onnx
WARNING: ONNX model has a newer ir_version (0.0.6) than this parser was built against (0.0.3).
While parsing node number 0 [Conv]:
ERROR: ModelImporter.cpp:296 In function importModel:
[5] Assertion failed: tensors.count(input_name)
[07/11/2022-14:57:42] [E] Unable to parse ONNX model file: yolov6s.onnx
&&&& FAILED TensorRT.sample_yolo
I have downloaded the yolov6s.onnx from https://github.com/meituan/YOLOv6/releases/tag/0.1.0
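For reference, a minimal parse-only sketch against the TensorRT 7 C++ API can reproduce just the parsing step and dump every error the ONNX parser records (the SampleLogger class below is only a stand-in logger, not part of the sample):

#include <iostream>
#include "NvInfer.h"
#include "NvOnnxParser.h"

// Minimal logger: print warnings and errors from TensorRT / the ONNX parser.
class SampleLogger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity <= Severity::kWARNING)
            std::cout << msg << std::endl;
    }
};

int main()
{
    SampleLogger logger;
    auto builder = nvinfer1::createInferBuilder(logger);
    const auto flags = 1U << static_cast<uint32_t>(
        nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);
    auto network = builder->createNetworkV2(flags);
    auto parser = nvonnxparser::createParser(*network, logger);

    // Parse only, then print every error the parser recorded.
    if (!parser->parseFromFile("yolov6s.onnx",
            static_cast<int>(nvinfer1::ILogger::Severity::kINFO)))
    {
        for (int i = 0; i < parser->getNbErrors(); ++i)
            std::cout << parser->getError(i)->desc() << std::endl;
    }
    return 0;
}

Because this exercises only the ONNX parsing step, it separates parser problems from issues elsewhere in the sample.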
In the code, the initializeSampleParams function is as follows:
SampleYoloParams initializeSampleParams(std::vector<std::string> args)
{
    SampleYoloParams params;

    // The engine file to generate or to load
    // The engine file does not exist:
    //     This program will try to load onnx file and convert onnx into engine
    // The engine file exists:
    //     This program will load the engine file directly
    params.engingFileName = "yolov6s.engine";

    // The onnx file to load
    params.onnxFileName = "yolov6s.onnx";

    // Input tensor name of ONNX file & engine file
    params.inputTensorNames.push_back("image_arrays");

    // Old batch configuration, it is zero if explicitBatch flag is true for the tensorrt engine
    // May be deprecated in the future
    params.batchSize = 0;

    params.outputClsSize = 80;

    // Threshold values
    params.confThreshold = 0.3;
    params.nmsThreshold = 0.5;

    // Batch size, you can modify to other batch size values if needed
    params.explicitBatchSize = 1;

    params.width = 640;
    params.height = 640;

    params.inputVideoName = "test.mp4";
    params.cocoClassNamesFileName = "coco.names";

    for (auto& arg : args)
    {
        params.demo = 1;
        params.outputImageName = "demo_out.jpg";

        if (arg == "--int8")
        {
            params.int8 = true;
        }
        else if (arg == "--fp16")
        {
            params.fp16 = true;
        }
    }

    specifyInputAndOutputNamesAndShapes(params);

    return params;
}
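As a side note, a hedged cross-check (assuming access to the nvinfer1::INetworkDefinition produced by the ONNX parser) is to print the model's real input names and compare them with the hard-coded "image_arrays":

#include <iostream>
#include "NvInfer.h"

// Hedged sketch: list the input tensors the parsed ONNX network actually exposes.
void printNetworkInputs(nvinfer1::INetworkDefinition& network)
{
    for (int i = 0; i < network.getNbInputs(); ++i)
    {
        std::cout << "ONNX input " << i << ": "
                  << network.getInput(i)->getName() << std::endl;
    }
}

A mismatch there would not explain the parser assertion itself, but it is a cheap thing to rule out.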
Please let me know your comments,
Thanks
I will take a look.
The "[5] Assertion failed: tensors.count(input_name)" error is related to a mismatch between ONNX versions. Please check out this issue:
https://github.com/onnx/onnx-tensorrt/issues/302
Also, could you download this version, https://github.com/meituan/YOLOv6/releases/download/0.1.0/yolov6t.onnx, and try it again?
What is your TensorRT version?
It works fine on my machine.
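If it helps, a minimal hedged way to confirm which TensorRT version a binary is actually built against is to print the version macros from NvInferVersion.h:

#include <iostream>
#include "NvInferVersion.h"

int main()
{
    // Version macros provided by the TensorRT headers.
    std::cout << "TensorRT " << NV_TENSORRT_MAJOR << "."
              << NV_TENSORRT_MINOR << "." << NV_TENSORRT_PATCH << std::endl;
    return 0;
}

This can be useful when several TensorRT installations are present on the same machine.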

I tried with yolov6t.onnx but the problem remains the same. My TensorRT version is TensorRT-7.2.2.3.
CUDA Version: 11.1
Could you post the exact location of yolov6s.onnx?
I mean the absolute path to your YOLO model:
// The onnx file to load
params.onnxFileName = "<absolute path to yolov6s.onnx>";
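As a quick sanity check (a hedged sketch in plain standard C++, not part of the sample), you can also verify that the path can be opened from the working directory the executable actually runs in:

#include <fstream>
#include <string>

// Returns true if the file at `path` can be opened for reading.
bool fileExists(const std::string& path)
{
    std::ifstream f(path, std::ios::binary);
    return f.good();
}

// Example use: if (!fileExists(params.onnxFileName)) { /* report the bad path */ }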
Of course, yolov6s.onnx is at the right path; otherwise there is the following error:

OK, I will test it with the TensorRT-7.2.2.3 version and let you know.
Which version do you use?
TensorRT-7.2.1.6
I tried to compile with TensorRT-7.2.1.6 but the problem remains the same.
