Add dynamo expand test.
This PR adds a test for #5837. The fix was introduced in the PyTorch main repository (https://github.com/pytorch/pytorch/pull/121007), but we need a test in PyTorch/XLA to actually exercise the (previously) failing case.
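For context, a minimal sketch of the kind of test this adds (the exact test body in the PR may differ; the function name, shapes, and the `openxla` backend choice below are assumptions for illustration):

```python
import torch
import torch_xla.core.xla_model as xm


def fn(x):
    # expand() is the op that previously failed under dynamo (see #5837).
    return x.expand(2, 3, 5) + 1


def test_dynamo_expand():
    device = xm.xla_device()
    x = torch.randn(3, 5)
    expected = fn(x)  # eager CPU reference

    # Compile with the dynamo openxla backend and run on the XLA device.
    compiled = torch.compile(fn, backend="openxla")
    actual = compiled(x.to(device))

    torch.testing.assert_close(expected, actual.cpu())
```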
cc @miladm @JackCaoG
You can add a .torch_pin to build against your upstream patch PR, following the instructions here: https://github.com/pytorch/xla/tree/master/torch_patches
That way we don't need to add any guards; we just need to land the two patches together.
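If I remember the convention correctly (please double-check the linked README), the pin is just a file named `.torch_pin` under `torch_patches/` whose only content points at the upstream change, e.g. the PR number:

```
#121007
```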
Great, looks like all tests pass! Please ping me once your upstream PR is approved. Also, we need to remove the .torch_pin before landing the PR.