TensorRT
🐛 [Bug] Error: "Conversion of function torch._ops.aten.aten::cumsum not currently supported!" on CUDA 11.8 and 12.1
Bug Description
In CI, all cumsum tests failed on CUDA 11.8 and 12.1, but pass on CUDA 12.4. The error looks like:
FAILED conversion/test_cumsum_aten.py::TestCumsumConverter::test_cumsum_1D_0 - torch_tensorrt.dynamo.conversion._TRTInterpreter.UnsupportedOperatorException: Conversion of function torch._ops.aten.aten::cumsum not currently supported!
FAILED conversion/test_cumsum_aten.py::TestCumsumConverter::test_cumsum_1D_1 - torch_tensorrt.dynamo.conversion._TRTInterpreter.UnsupportedOperatorException: Conversion of function torch._ops.aten.aten::cumsum not currently supported!
FAILED conversion/test_cumsum_aten.py::TestCumsumConverter::test_cumsum_1D_2 - torch_tensorrt.dynamo.conversion._TRTInterpreter.UnsupportedOperatorException: Conversion of function torch._ops.aten.aten::cumsum not currently supported!
...
The tests also pass on my local machine with an RTX 4080 and CUDA 12.2.