
Mitigates SWDEV-459618

Open xinyazhang opened this issue 8 months ago • 0 comments

`_c10d_functional_autograd::all_to_all_single` does not appear to be implemented on ROCm.

Note: a separate, still-unfixed problem is a mismatch between the outputs of `torch.ops.aten._scaled_dot_product_flash_attention` and `_scaled_dot_product_chunk_flash_attention`. Both problems need to be fixed before this unit test can be enabled.

Fixes SWDEV-459618

xinyazhang · Jun 03 '24 16:06