
✨[Feature] Autogen TRT Plugins

Open narendasan opened this issue 1 year ago • 0 comments

Is your feature request related to a problem? Please describe.

There are some cases where taking the extra step of wrapping a Torch layer or PyTorch custom op in a plugin and embedding it in a TRT engine may improve the performance of the model. However, there is a large amount of boilerplate needed to actually expose the operator to TensorRT. It would be great if this could be abstracted away for users.

Describe the solution you'd like

Given a functional Torch operator and a FakeTensor implementation, autogenerate the TensorRT plugin code needed to embed that op in a TRT engine.
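
A minimal sketch of what the user-facing side could look like, assuming PyTorch's `torch.library.custom_op` / `register_fake` APIs for defining the op; the `auto_generate_trt_plugin` helper at the end is hypothetical and stands in for whatever autogeneration entry point this feature would provide:

```python
import torch

# Define a functional custom op with a real (eager) implementation.
@torch.library.custom_op("my_lib::relu_mul", mutates_args=())
def relu_mul(x: torch.Tensor, scale: float) -> torch.Tensor:
    return torch.relu(x) * scale

# FakeTensor / meta implementation: only output shapes and dtypes matter here,
# which is exactly the information a TRT plugin needs for shape inference.
@relu_mul.register_fake
def _(x: torch.Tensor, scale: float) -> torch.Tensor:
    return torch.empty_like(x)

# Hypothetical autogeneration call (not an existing API): emit the TensorRT
# plugin and the Torch-TensorRT converter for "my_lib::relu_mul" so the op can
# be placed directly in a TRT engine instead of forcing a graph break.
# auto_generate_trt_plugin("my_lib::relu_mul")
```

The key observation is that the eager implementation plus the FakeTensor (shape/dtype) function should already contain everything the plugin boilerplate currently duplicates by hand.
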

Describe alternatives you've considered

This could also be done in C++, but that would likely be more complicated than handling it in Python.

Additional context

https://github.com/pytorch/TensorRT/blob/main/examples/dynamo/custom_kernel_plugins.py

narendasan · May 06 '24 23:05