
[Bug] `torch.squeeze` fails to squeeze negative dimensions.

Open chaoz-dev opened this issue 2 years ago • 1 comment

The torch.squeeze converter appears to skip squeezing a dimension when that dimension is specified with a negative index.

For example:

  import logging
  import tensorrt
  import torch
  import torch2trt


  logging.basicConfig(level=logging.INFO)
  torch.manual_seed(0)


  class SqueezeModule(torch.nn.Module):
      def forward(self, t: torch.Tensor):
          return t.squeeze(-1).squeeze(-1)


  if __name__ == "__main__":
      tensor = torch.rand(2, 2, 1, 1).cuda()

      model = SqueezeModule().eval().cuda()
      model(tensor)

      model_trt = torch2trt.torch2trt(
          model,
          [tensor],
      )

      out = model(tensor)
      out_trt = model_trt(tensor)

      assert torch.allclose(out, out_trt), f"Not all close {out} {out_trt}"
      print("All close!")

Outputs the following:

AssertionError: Not all close
tensor([[0.4963, 0.7682],
        [0.0885, 0.1320]], device='cuda:0')
tensor([[[0.4963],
         [0.7682]],

        [[0.0885],
         [0.1320]]], device='cuda:0')

The TRT output tensor is incorrectly shaped compared to the original output.
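A likely cause is that the converter compares the requested dim directly against the input shape without first mapping negative indices to their non-negative equivalents. Below is a minimal sketch of that normalization; `normalize_dim` is a hypothetical helper for illustration, not part of the torch2trt API, and it follows PyTorch's convention that a valid dim lies in `[-ndim, ndim - 1]`:

```python
def normalize_dim(dim: int, ndim: int) -> int:
    """Map a possibly-negative dim index to its non-negative equivalent,
    following PyTorch's convention (valid dims are in [-ndim, ndim - 1])."""
    if not -ndim <= dim < ndim:
        raise IndexError(f"dim {dim} out of range for a {ndim}-D tensor")
    # Negative indices count from the end, so -1 maps to ndim - 1, etc.
    return dim + ndim if dim < 0 else dim

# For the 4-D input in the reproduction above, squeeze(-1) should
# behave like squeeze(3), and the second squeeze(-1) (now on a 3-D
# tensor) like squeeze(2):
assert normalize_dim(-1, 4) == 3
assert normalize_dim(-1, 3) == 2
assert normalize_dim(0, 4) == 0
```

Until the converter handles this, passing explicit non-negative dims (e.g. `t.squeeze(3).squeeze(2)` for the module above) may serve as a workaround.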

chaoz-dev avatar Feb 08 '23 19:02 chaoz-dev

I'll put up a fix shortly.

chaoz-dev avatar Feb 08 '23 19:02 chaoz-dev