
🐛 [Bug] Error: "Conversion of function torch._ops.aten.aten::cumsum not currently supported!" on CUDA 11.8 and 12.1


Bug Description

In the CI tests, all cumsum tests fail on CUDA 11.8 and 12.1, but pass on CUDA 12.4. The errors look like:

FAILED conversion/test_cumsum_aten.py::TestCumsumConverter::test_cumsum_1D_0 - torch_tensorrt.dynamo.conversion._TRTInterpreter.UnsupportedOperatorException: Conversion of function torch._ops.aten.aten::cumsum not currently supported!
FAILED conversion/test_cumsum_aten.py::TestCumsumConverter::test_cumsum_1D_1 - torch_tensorrt.dynamo.conversion._TRTInterpreter.UnsupportedOperatorException: Conversion of function torch._ops.aten.aten::cumsum not currently supported!
FAILED conversion/test_cumsum_aten.py::TestCumsumConverter::test_cumsum_1D_2 - torch_tensorrt.dynamo.conversion._TRTInterpreter.UnsupportedOperatorException: Conversion of function torch._ops.aten.aten::cumsum not currently supported!
...

The tests also pass on my local machine with an RTX 4080 + CUDA 12.2.
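For reference, a minimal sketch of the kind of compilation that exercises this converter is below. This is an assumption on my part, not the actual CI test from conversion/test_cumsum_aten.py; the module name and input shape are made up. It compiles a single torch.cumsum call through the dynamo frontend, which should route through the aten::cumsum converter and, on the affected CUDA 11.8/12.1 builds, raise the UnsupportedOperatorException shown above.

```python
# Hypothetical repro sketch (not the CI test itself).
import torch
import torch_tensorrt

class Cumsum1D(torch.nn.Module):
    def forward(self, x):
        # Lowers to torch.ops.aten.cumsum, the op named in the error above.
        return torch.cumsum(x, dim=0)

inputs = [torch.randn(4, device="cuda")]
trt_mod = torch_tensorrt.compile(
    Cumsum1D().eval().cuda(),
    ir="dynamo",
    inputs=inputs,
    min_block_size=1,  # force even this single op through the TRT converter
)
print(trt_mod(*inputs))
```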

zewenli98 · Oct 06 '24