Would like a Float16 data type for the input and nodes of the SSD-MobileNetV2 ONNX model trained on the COCO dataset
Feature Request
What is the problem that this feature solves?
When I run Isaac SDK TensorRT inference with the current SSD-MobileNet ONNX file, it reports an unsupported UINT8 data type where Float16 is expected. I need a pre-trained model whose input uses the Float16 or Float32 data type. Most of the ONNX files and TensorFlow frozen graphs available online and in repositories use the UINT8 data type.
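As a quick way to confirm which element type a model's graph inputs declare before handing it to TensorRT, the check below can be sketched in Python. The type codes are the ones defined in onnx.proto; `TENSORRT_SUPPORTED` is an assumption about which input dtypes the parser accepts here, and the model file name in the comment is hypothetical.

```python
# ONNX TensorProto element-type codes (a few relevant ones, from onnx.proto).
ELEM_TYPE_NAMES = {1: "FLOAT", 2: "UINT8", 10: "FLOAT16"}

# Assumption: the input dtypes the TensorRT ONNX parser accepts in this setup.
TENSORRT_SUPPORTED = {"FLOAT", "FLOAT16"}

def describe_input(elem_type: int) -> str:
    """Return a human-readable verdict for an ONNX graph-input element type."""
    name = ELEM_TYPE_NAMES.get(elem_type, f"code {elem_type}")
    if name in TENSORRT_SUPPORTED:
        return f"{name}: OK for TensorRT"
    return f"{name}: unsupported, needs conversion"

# With the onnx package installed, the check would look like this
# (hypothetical file name):
#   import onnx
#   model = onnx.load("ssd_mobilenet_v2.onnx")
#   for inp in model.graph.input:
#       print(inp.name, describe_input(inp.type.tensor_type.elem_type))
```

Running `describe_input(2)` on a UINT8 input flags it as unsupported, which matches the error seen from the TensorRT parser.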
Please detail the discrepancy with our current functionality.
Describe the feature
Why is this feature necessary? What does it accomplish? I would like a model in ONNX format, trained on the COCO dataset with 91 classes, whose input and nodes use the Float16 data type. It should perform object detection accurately under TensorRT inference as implemented in the Isaac SDK framework.
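Until such a model is published, one possible workaround is to retype the graph inputs of an existing model. The sketch below shows the idea on a stand-in object shaped like ONNX's GraphProto; with the real onnx package you would load the model, apply the same rewrite to `model.graph`, and run `onnx.checker.check_model` afterwards. Note this is an assumption-laden sketch: retyping the input alone may not be enough if the graph itself contains UINT8-specific preprocessing.

```python
from types import SimpleNamespace

FLOAT = 1   # onnx.TensorProto.FLOAT
UINT8 = 2   # onnx.TensorProto.UINT8

def retype_inputs(graph, old=UINT8, new=FLOAT):
    """Rewrite every graph input whose element type is `old` to `new`.

    Works on any object shaped like ONNX's GraphProto, i.e. exposing
    graph.input[i].type.tensor_type.elem_type. Returns the names of
    the inputs that were changed.
    """
    changed = []
    for inp in graph.input:
        tensor_type = inp.type.tensor_type
        if tensor_type.elem_type == old:
            tensor_type.elem_type = new
            changed.append(inp.name)
    return changed

# Stand-in for a GraphProto with one UINT8 image input (for demonstration;
# the input name is hypothetical):
graph = SimpleNamespace(input=[SimpleNamespace(
    name="image_tensor",
    type=SimpleNamespace(tensor_type=SimpleNamespace(elem_type=UINT8)))])

print(retype_inputs(graph))  # → ['image_tensor']; the input is now typed FLOAT
```

The same function applies unchanged to a real `onnx.ModelProto`'s `model.graph`, since it only touches the `elem_type` fields.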
Relevant Model Zoo feature areas
Which area in the Model Zoo infrastructure does this impact?
Feature Area (e.g. CI, file organization, model storage, other):
Notes
Any additional information