torch2trt
[Bug] `torch.squeeze` fails to squeeze negative dimensions.
The `torch.squeeze` converter fails to squeeze the target dimension when the dimension is given as a negative index. For example:
```python
import logging

import tensorrt
import torch
import torch2trt

logging.basicConfig(level=logging.INFO)
torch.manual_seed(0)


class SqueezeModule(torch.nn.Module):
    def forward(self, t: torch.Tensor):
        return t.squeeze(-1).squeeze(-1)


if __name__ == "__main__":
    tensor = torch.rand(2, 2, 1, 1).cuda()
    model = SqueezeModule().eval().cuda()
    model(tensor)
    model_trt = torch2trt.torch2trt(
        model,
        [tensor],
    )
    out = model(tensor)
    out_trt = model_trt(tensor)
    assert torch.allclose(out, out_trt), f"Not all close {out} {out_trt}"
    print("All close!")
```
This outputs:

```
AssertionError: Not all close
tensor([[0.4963, 0.7682],
        [0.0885, 0.1320]], device='cuda:0')
tensor([[[0.4963],
         [0.7682]],

        [[0.0885],
         [0.1320]]], device='cuda:0')
```
The TensorRT output retains a trailing singleton dimension (shape `(2, 2, 1)`) instead of matching the fully squeezed PyTorch output shape `(2, 2)`.
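A likely cause is that the converter compares the raw (possibly negative) `dim` argument against non-negative axis indices without first normalizing it. As a rough sketch of the expected behavior (helper names here are hypothetical, not torch2trt internals), squeezing with a normalized dimension works like this:

```python
def normalize_dim(dim: int, ndim: int) -> int:
    # Map a possibly-negative dimension index to its non-negative
    # equivalent, following PyTorch's indexing convention
    # (e.g. dim=-1 with ndim=4 maps to 3).
    return dim % ndim


def squeeze_shape(shape, dim):
    # Simulate torch.Tensor.squeeze(dim) on a plain shape tuple:
    # the axis is dropped only if its size is 1.
    d = normalize_dim(dim, len(shape))
    if shape[d] != 1:
        return tuple(shape)
    return tuple(s for i, s in enumerate(shape) if i != d)


# Squeezing dim -1 twice on (2, 2, 1, 1) should yield (2, 2),
# matching the PyTorch output above.
s = squeeze_shape((2, 2, 1, 1), -1)
s = squeeze_shape(s, -1)
print(s)  # (2, 2)
```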
I'll put up a fix shortly.