Apurba Bose
```
FAILED conversion/test_index_put_aten.py::TestIndexPutConverter::test_bool_mask_test - AssertionError
FAILED conversion/test_index_aten.py::TestIndexConstantConverter::test_index_constant_bool_mask_0_mask_index_three_dim - AssertionError
FAILED conversion/test_index_aten.py::TestIndexConstantConverter::test_index_constant_bool_mask_1_mask_index_two_dim - AssertionError
FAILED conversion/test_index_aten.py::TestIndexConstantConverter::test_index_constant_bool_mask_2_mask_index_multi_axis - AssertionError
FAILED conversion/test_reshape_aten.py::TestReshapeConverter::test_reshape_0 - torch_tensorrt.dynamo.conversion._TRTInterpreter.UnsupportedOperatorException: Conversion of function torch._ops.aten.aten::view not currently supported!
```
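For reference, a minimal repro sketch for the `aten::view` failure, assuming `torch_tensorrt.compile` on the default dynamo path accepts example tensors via `inputs=`; the module and shapes below are illustrative and not taken from the test suite:

```python
import torch
import torch_tensorrt


class ViewModule(torch.nn.Module):
    # x.view(...) traces to torch.ops.aten.view, the op the reshape
    # test reports as unsupported during conversion.
    def forward(self, x):
        return x.view(-1, 4)


model = ViewModule().eval().cuda()
inputs = [torch.randn(2, 3, 4).cuda()]

# Expected to raise UnsupportedOperatorException if aten.view has no converter.
trt_model = torch_tensorrt.compile(model, inputs=inputs)
print(trt_model(*inputs).shape)
```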
The Torch distributed data parallel GPT-2 example (launched via Hugging Face `accelerate`) is failing:

```
cd examples/distributed_inference
CUDA_VISIBLE_DEVICES=0 accelerate launch data_parallel_gpt2.py
accelerate launch data_parallel_gpt2.py
```

torch 2.9.0.dev20250821+cu129
torch_tensorrt 2.10.0.dev0+0
accelerate 1.10.1
Bug: the LLM decoder-layer tests fail in both FP16 and BF16:

```
FAILED llm/test_llm_models.py::test_llm_decoder_layer[FP16] - AssertionError
FAILED llm/test_llm_models.py::test_llm_decoder_layer[BF16] - AssertionError
```

TRT 10.13.3.9
PyTorch 2.10.0a0+b558c986e8

The same tests pass on A100. Error:

```
2025-10-11T04:31:34.722267Z 01O ERROR torch_tensorrt [TensorRT Conversion Context]:logging.py:22 Error Code: 9: Skipping tactic...
```
The `index_put` converter is missing the following cases:

1. Non-consecutive indices (a plain-PyTorch sketch of this case follows the snippet)

```
param(
    test_name="nonconsecutive_caseOne",
    source_tensor=torch.zeros([2, 4, 4, 2], dtype=torch.float32),
    indices_tensor=(None, torch.tensor([0, 0, 1, 1], dtype=torch.int64), None, torch.tensor([0, 0, 1, 1], dtype=torch.int64)),
    ...
```
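As a hedged illustration of the non-consecutive case (not code from the converter test), the same pattern can be expressed with plain PyTorch advanced-index assignment; because the index tensors are separated by a sliced dimension, the broadcast index dimension moves to the front of the value shape:

```python
import torch

# Illustrative values only; shapes follow the param() snippet above.
source = torch.zeros(2, 4, 4, 2, dtype=torch.float32)
idx = torch.tensor([0, 0, 1, 1], dtype=torch.int64)

# Advanced indices on dims 1 and 3, full slices (the None entries) on dims 0 and 2.
# Since the index tensors are non-consecutive, the broadcast index dimension
# (length 4) comes first in the assigned value's shape: (4, 2, 4).
values = torch.ones(4, 2, 4, dtype=torch.float32)

# In-place assignment dispatches to aten.index_put_ with indices
# (None, idx, None, idx), the case the converter does not yet handle.
source[:, idx, :, idx] = values

print(source.shape)  # torch.Size([2, 4, 4, 2])
```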