Dheeraj Peri

Results: 121 comments by Dheeraj Peri

Here's what I think could be a simpler way of doing this: 1) We probably don't have to store output_shapes in the `TorchTensorRTModule` class. Once the compilation is finished, verify if...

The minimal reproducer for this bug:

```py
"""Test script for TRT export"""
import torch
import torch_tensorrt
from torch.export import export, Dim

class TestMod(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear1 = torch.nn.Linear(10, 10)
        self.linear2...
```

This PR fixes this issue: https://github.com/pytorch/TensorRT/pull/2918

Question to pytorch: https://github.com/pytorch/pytorch/issues/128640

The solution is not to use `max=1024`; `seq_len = torch.export.Dim("seq_len", min=1, max=1023)` works.

Related issues: https://github.com/pytorch/TensorRT/issues/2912

@jiwoong-choi Can you provide a model where you have seen this issue? We have tested some Hugging Face models and haven't encountered it. From my understanding, the `GraphModule` should have...

@jiwoong-choi Sorry for the delay. The `graph_module.graph` here is as follows:

```py
...
...
...
%layer_norm_24 : [num_users=2] = call_function[target=torch.ops.aten.layer_norm.default](args = (%add_25, [768], %encoder_layer_11_output_layer_norm_weight, %encoder_layer_11_output_layer_norm_bias, 1e-12), kwargs = {})
%slice_5...
```

> Is this the complete repro code? When I run the above, it shows me `TypeError: forward() missing 1 required positional argument: 'L_self_modules_pooler_modules_dense_parameters_bias_'`; it seems like it is missing some input....