
Add TensorRT support

turowicz opened this issue 3 years ago · 7 comments

Is your feature request related to a problem? Please describe.
Currently, the fastest way to run computer-vision inference is with a TensorRT-optimised model. TensorRT is readily usable from C/C++, but there is no practical way to use it from C#.

Describe the solution you'd like
I would like to be able to load a TensorRT engine into C# memory and call it from there using OpenCvSharp's Mat structures.

Describe alternatives you've considered
We are currently using Triton Inference Server, but it adds overhead for data serialisation and transmission.

Additional context
Certain scenarios, such as quality control, would benefit greatly from calling a TensorRT model in-process.

turowicz · Jun 01 '22 14:06

Hi @turowicz

ML.NET offers the ability to export models to ONNX, which, from my understanding, is one of the formats TensorRT supports.

To export an ML.NET model to ONNX, you use the ConvertToOnnx method.

Here's additional documentation on how to do it as well:

https://docs.microsoft.com/dotnet/machine-learning/how-to-guides/save-load-machine-learning-models-ml-net#save-an-onnx-model-locally
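For reference, a minimal sketch of that export path. It assumes the Microsoft.ML and Microsoft.ML.OnnxConverter NuGet packages; the pipeline, column names, and file names below are illustrative only, not taken from your project:

```csharp
using System.IO;
using Microsoft.ML;
using Microsoft.ML.Data;

var mlContext = new MLContext();

// Hypothetical training data and pipeline, for illustration only.
IDataView data = mlContext.Data.LoadFromTextFile<ModelInput>(
    "data.csv", hasHeader: true, separatorChar: ',');

var pipeline = mlContext.Transforms
    .Concatenate("Features", "Feature1", "Feature2")
    .Append(mlContext.Regression.Trainers.Sdca(labelColumnName: "Label"));

ITransformer model = pipeline.Fit(data);

// Export the trained ML.NET model to ONNX (Microsoft.ML.OnnxConverter).
using (var stream = File.Create("model.onnx"))
{
    mlContext.Model.ConvertToOnnx(model, data, stream);
}

public class ModelInput
{
    [LoadColumn(0)] public float Feature1 { get; set; }
    [LoadColumn(1)] public float Feature2 { get; set; }
    [LoadColumn(2)] public float Label { get; set; }
}
```

The resulting model.onnx could then be handed to any ONNX-aware runtime, including TensorRT's ONNX parser.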

Does that satisfy your requirements?

luisquintanilla · Jun 10 '22 22:06

I meant the other way round: load a TensorRT model in ML.NET and run inference on data.

turowicz · Jun 13 '22 08:06

What model format are you thinking of? Still ONNX? It also sounds like a large part of the ask is having a way to avoid copying the data; is that correct?

michaelgsharp · Jun 13 '22 17:06

Model format: TensorRT by NVIDIA

turowicz · Jun 13 '22 20:06

Load it in C# and run inference. This would require C# externs for the TensorRT C runtime.
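To make the ask concrete, here is a sketch of what such externs could look like. TensorRT ships a C++ API, so in practice the externs would bind a thin hand-written C shim; every name below (the library, functions, and parameters) is hypothetical, not part of TensorRT itself:

```csharp
using System;
using System.Runtime.InteropServices;

// Hypothetical P/Invoke surface. "trtshim" stands for a small C wrapper
// you would write around TensorRT's C++ runtime (deserialize engine,
// run inference, free resources); none of these entry points exist in
// TensorRT as shipped.
internal static class TrtShim
{
    // Deserialize a serialized engine file and return an opaque handle.
    [DllImport("trtshim", CharSet = CharSet.Ansi)]
    internal static extern IntPtr trtshim_load_engine(string enginePath);

    // Run inference on a raw input buffer, writing to a raw output buffer.
    [DllImport("trtshim")]
    internal static extern int trtshim_infer(
        IntPtr engine,
        IntPtr input, int inputBytes,
        IntPtr output, int outputBytes);

    // Release the engine and its GPU resources.
    [DllImport("trtshim")]
    internal static extern void trtshim_destroy(IntPtr engine);
}

// Usage idea (commented out, since it depends on the shim and OpenCvSharp):
// an OpenCvSharp Mat exposes its pixel buffer as an IntPtr via Mat.Data,
// so preprocessed frames could be passed to the engine without an extra
// serialisation step, which is the in-process benefit described above.
//
// IntPtr engine = TrtShim.trtshim_load_engine("model.engine");
// TrtShim.trtshim_infer(engine, mat.Data, inputBytes, outPtr, outputBytes);
// TrtShim.trtshim_destroy(engine);
```

This is only one possible shape for the binding; the point is that some native glue layer is required because TensorRT has no official C API to P/Invoke directly.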

turowicz · Jun 13 '22 20:06

@luisquintanilla I'll mark this as future for now, but we need to figure out whether this aligns with our goals and, if so, when we would be able to take a look at it.

michaelgsharp · Jul 11 '22 17:07