torch2trt

An easy to use PyTorch to TensorRT converter

235 torch2trt issues

Hi, with the `tensorrt_converter` for 3D instance norm, the result is abnormal. Although verification by `pytest` passes with the default initialization, instance norm fails with random initialization. https://github.com/NVIDIA-AI-IOT/torch2trt/blob/master/torch2trt/converters/native_converters.py?plain=1#L902...
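Not from the issue itself, but a minimal reproduction sketch of the kind of comparison described; the module size, input shape, and random re-initialization are assumptions:

```python
import torch
from torch2trt import torch2trt

# Hypothetical reproduction: InstanceNorm3d with randomly re-initialized affine
# parameters, compared against its TensorRT conversion.
norm = torch.nn.InstanceNorm3d(8, affine=True).cuda().eval()
torch.nn.init.normal_(norm.weight)  # replace the default ones/zeros initialization
torch.nn.init.normal_(norm.bias)

x = torch.randn(1, 8, 4, 16, 16).cuda()
norm_trt = torch2trt(norm, [x])

# A large difference here would indicate the converter mishandles the
# randomly initialized affine parameters.
print(torch.max(torch.abs(norm(x) - norm_trt(x))))
```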

When I run the command `sudo python3 setup.py install --plugins`, I get an error: `Illegal instruction`.

Hi all, when I convert the following operation:

```python
x_padded = torch.nn.functional.pad(x, (0, 0, pad_left, pad_right))
```

I get the error below:

```bash
AttributeError: 'tensorrt_bindings.tensorrt.INetworkDefinition' object has no...
```
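A self-contained sketch of how that padding call might be wrapped and passed to the converter; the wrapper module, padding amounts, and input shape are assumptions for illustration:

```python
import torch
from torch2trt import torch2trt

class PadModule(torch.nn.Module):
    """Hypothetical wrapper around the F.pad call from the issue."""
    def __init__(self, pad_left=1, pad_right=1):
        super().__init__()
        self.pad_left = pad_left
        self.pad_right = pad_right

    def forward(self, x):
        # (0, 0, pad_left, pad_right) pads only the second-to-last dimension.
        return torch.nn.functional.pad(x, (0, 0, self.pad_left, self.pad_right))

model = PadModule().cuda().eval()
x = torch.randn(1, 3, 32, 32).cuda()
model_trt = torch2trt(model, [x])  # triggers the AttributeError reported above
```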

I have a Jetson Orin Nano. I've tried to install torch2trt on my Jetson device inside Docker. Here is my error log during installation: root@ubuntu:/home/torch2trt# python3 setup.py install...

Because the flattener limits inputs to floats, it wrongly removes any int inputs, which are common in text models (e.g., encoded token IDs). Discovered when trying to convert...
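A minimal sketch of the kind of int-input model this affects; the embedding model, vocabulary size, and names are illustrative assumptions:

```python
import torch
from torch2trt import torch2trt

class TinyTextModel(torch.nn.Module):
    """Hypothetical text model whose only input is integer token ids."""
    def __init__(self, vocab_size=100, dim=16):
        super().__init__()
        self.embed = torch.nn.Embedding(vocab_size, dim)

    def forward(self, token_ids):
        return self.embed(token_ids).mean(dim=1)

model = TinyTextModel().cuda().eval()
token_ids = torch.randint(0, 100, (1, 8), dtype=torch.long, device='cuda')

# If the flattener drops non-float inputs, the converter is left with no
# inputs for this module, so the conversion cannot succeed.
model_trt = torch2trt(model, [token_ids])
```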

```python
import torch
from transformers import AutoImageProcessor, AutoModelForZeroShotImageClassification, AutoTokenizer, ZeroShotImageClassificationPipeline, SiglipProcessor, SiglipModel
from torch2trt import torch2trt

model = SiglipModel.from_pretrained('google/siglip-large-patch16-384', torch_dtype=torch.float16).cuda()
text_model = model.text_model
dummy = torch.ones(1, 64, dtype=torch.long, device='cuda')
text_model(dummy)
```
...

Hello. A SyntaxWarning appeared when I imported torch2trt:

```
C:\DeepLearning\torch2trt-master\torch2trt\dataset.py:61: SyntaxWarning: assertion is always true, perhaps remove parentheses?
  assert(len(self) > 0, 'Cannot create default flattener without input data.')
```

I think...
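For context, `assert (condition, message)` asserts a two-element tuple, which is always truthy, so the check can never fail; a small runnable sketch of the behaviour and of the standard correction the warning points toward (the fix itself is not taken from the issue):

```python
items = []

# Buggy form from dataset.py: the parentheses make the assert operand a
# two-element tuple, which is always truthy, so the check never fires.
assert (len(items) > 0, 'Cannot create default flattener without input data.')
print('buggy assert passed even though items is empty')

# Corrected form: condition and message as separate assert operands.
try:
    assert len(items) > 0, 'Cannot create default flattener without input data.'
except AssertionError as e:
    print('corrected assert fired:', e)
```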

Before operator conversion, some input tensors already had an attribute called '_trt'. ![Screenshot (240)](https://github.com/NVIDIA-AI-IOT/torch2trt/assets/137691495/d990a632-9db2-4327-bfc0-52ee71448012) To deal with this, I deleted these incorrect '_trt' attributes manually in the related converter functions in "native_converters.py". However, I've...
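A rough sketch of the kind of manual workaround described above, clearing stale `_trt` attributes from input tensors before conversion; the helper name and its usage are assumptions, not part of torch2trt's API:

```python
import torch

def clear_stale_trt_attrs(tensors):
    """Hypothetical helper: remove leftover '_trt' markers so the converter
    registers these tensors fresh instead of reusing stale TensorRT handles."""
    for t in tensors:
        if isinstance(t, torch.Tensor) and hasattr(t, '_trt'):
            del t._trt

# Usage sketch: call on the converter inputs before running torch2trt.
# inputs = [x]
# clear_stale_trt_attrs(inputs)
# model_trt = torch2trt(model, inputs)
```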

I'd like to save my model into a TRT file. Could you please provide sample code?
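For reference, the pattern documented in the torch2trt README is to save the converted module's state dict and reload it with `TRTModule`; a sketch along those lines (the model and file names are placeholders, and the raw-engine export in option 2 is an assumption):

```python
import torch
from torch2trt import torch2trt, TRTModule
from torchvision.models import resnet18

# Convert a model (resnet18 here is only a placeholder example).
model = resnet18().cuda().eval()
x = torch.randn(1, 3, 224, 224).cuda()
model_trt = torch2trt(model, [x])

# Option 1: save the converted module as a PyTorch state dict and reload it
# later with TRTModule (the pattern shown in the README).
torch.save(model_trt.state_dict(), 'model_trt.pth')
model_trt_loaded = TRTModule()
model_trt_loaded.load_state_dict(torch.load('model_trt.pth'))

# Option 2 (assumption): write the serialized TensorRT engine to a raw file
# that other TensorRT tooling can deserialize directly.
with open('model.trt', 'wb') as f:
    f.write(model_trt.engine.serialize())
```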