TensorRT
PyTorch/TorchScript/FX compiler for NVIDIA GPUs using TensorRT
For the example below, how do I save the compiled model?

```python
backend = "torch_tensorrt"
tp_model = torch.compile(
    tp_model,
    backend=backend,
    options={
        "truncate_long_and_double": True,
        "enabled_precisions": {torch.float32, torch.float16},
        "use_python_runtime": True,
        "min_block_size": 1,
    },
...
```
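A minimal sketch of one possible answer, assuming Torch-TensorRT 2.x: a module compiled via `torch.compile(..., backend="torch_tensorrt")` is recompiled in-process and is not directly serializable, so the usual workaround is to compile through the Dynamo IR and serialize with `torch_tensorrt.save`. The model and input shapes below are placeholders, and running this requires an NVIDIA GPU with the `torch_tensorrt` package installed.

```python
import torch
import torch_tensorrt

# Placeholder model and example inputs (assumptions for illustration)
model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.ReLU()).eval().cuda()
inputs = [torch.randn(1, 3, 224, 224).cuda()]

# Compile via the Dynamo IR so the result is a serializable GraphModule,
# using options analogous to the torch.compile example above.
trt_gm = torch_tensorrt.compile(
    model,
    ir="dynamo",
    inputs=inputs,
    enabled_precisions={torch.float32, torch.float16},
    truncate_long_and_double=True,
    min_block_size=1,
)

# Serialize to disk as an ExportedProgram...
torch_tensorrt.save(trt_gm, "trt_model.ep", inputs=inputs)

# ...and reload it later with torch.export
loaded = torch.export.load("trt_model.ep").module()
```

This avoids the `torch.compile` path entirely for deployment; the compiled-and-saved program can then be loaded without re-running compilation.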
Add support for a JetPack 6.2 build. Currently JetPack 6.2 ships: CUDA 12.6, Python 3.10, TensorRT 10.3, DLFW 24.08 (PyTorch 2.6.0). Jetson now distributes wheels at https://pypi.jetson-ai-lab.dev/ ; JetPack 6.2 wheels: https://pypi.jetson-ai-lab.dev/jp6/cu126
## ❓ Question How do you export a Triton kernel together with a model to a serialized engine that can be run in C++? ## What you have already tried Read through...
**Is your feature request related to a problem? Please describe.** We need faster turnaround time on PRs being validated in CI. Right now builds take 20 minutes, plus another 40 minutes to 1.5 hours...
As you can see, the `overview` label in the sidebar is a bit misleading; it would help users if it were more descriptive.
## Bug Description The presence of a BUILD file in the released wheel breaks use of the package in Bazel with rules_python (https://github.com/bazel-contrib/rules_python/issues/2780). ## To Reproduce Steps to reproduce the behavior: - See...
DLFW NGC container shows this error.
The example script fx/quantized_resnet_test.py in the Torch-TensorRT repository fails to execute because it uses the deprecated attribute `EXPLICIT_PRECISION` from the TensorRT Python API. This attribute is no longer...
## Bug Description I trained an SSDLite320 MobileNetV3-Large on the WIDER FACE dataset for a face detection task. Here is what I received when running `torch_tensorrt.compile()`: > (capstone) jetson@jetson-desktop:~/FaceRecognitionSystem/jetson/backend/python$ python test.py...
I am exporting a model that uses `torch.distributions.Categorical(...).sample()` to sample from the logits. I currently have a fixed-length loop within a torch.compile graph that includes sampling from the logits to choose...