Siyuan Liu

9 issues by Siyuan Liu

Currently in TPU CI we install the torch nightly whl before running tests; however, torch_xla is built against PyTorch HEAD. This can cause compatibility issues (e.g....

Similar to the backport request thread for the 2.2 release (https://github.com/pytorch/xla/issues/6036), this issue tracks backports for the 2.3 release. For any PRs you want to backport to 2.3, please reply with...

backport_2.3

Update xla pin to HEAD. Summary:
- Update bazel to 6.5.0
- Rename `PJRT_Structure_Base` to `PJRT_Extension_Base` to accommodate a change in XLA.

Similar to #5908, which added a trigger for the 2.2-rc1 release.

DO_NOT_MERGE_YET

Rebase after the pin update https://github.com/pytorch/xla/pull/6677 and #6494 land.

[DO NOT REVIEW UNTIL ALL FAILED CASES ARE FIXED]
- Add a new test to capture the return type of `fmod`
- Add an assertion on dtype in the test. Torch and...
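A minimal sketch of the kind of dtype check described above, assuming a torch_xla test environment; the test name and inputs are illustrative, not the PR's actual code:

```python
import torch
import torch_xla.core.xla_model as xm


def test_fmod_return_dtype():
    # Compare the dtype `fmod` returns on the XLA device against eager CPU,
    # using mixed integer/float operands where type promotion matters.
    device = xm.xla_device()
    lhs = torch.randint(0, 10, (4,), dtype=torch.int32)
    rhs = torch.tensor(2.5, dtype=torch.float32)
    cpu_out = torch.fmod(lhs, rhs)
    xla_out = torch.fmod(lhs.to(device), rhs.to(device))
    # The assertion on dtype mentioned in the description.
    assert xla_out.dtype == cpu_out.dtype
```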

Add blockwise quantized dot support for 8-bit and 4-bit weights. Test:
- Added unit tests
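A minimal sketch of what a blockwise quantized dot can look like for int8 weights, assuming per-block scales along the reduction dimension; the function and parameter names are illustrative, not the PR's actual kernel:

```python
import torch

def blockwise_quantized_dot(x, w_q, scales, block_size):
    # x:      (batch, in_features)                        float activations
    # w_q:    (in_features, out_features)                 int8 quantized weight
    # scales: (in_features // block_size, out_features)   per-block scales
    in_features, out_features = w_q.shape
    n_blocks = in_features // block_size
    # Dequantize block by block: each block of rows gets its own scale.
    w = w_q.to(x.dtype).reshape(n_blocks, block_size, out_features)
    w = w * scales.reshape(n_blocks, 1, out_features)
    return x @ w.reshape(in_features, out_features)
```

For 4-bit weights the same pattern would apply after unpacking two nibbles from each stored byte back into an int tensor.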

Quantization

- Fix the sharding yml file for proper megatron sharding
- Add a weight processing hook to pad the blockwise quantized weight so that the sharded dimension is divisible by the number of...
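A minimal sketch of such a padding hook; `num_shards` is a hypothetical stand-in for whatever count the truncated description divides by, and the function name is illustrative:

```python
import torch

def pad_for_sharding(w_q, num_shards, dim=0):
    # Pad the sharded dimension of a quantized weight with zeros so its size
    # becomes divisible by `num_shards` (hypothetical parameter).
    size = w_q.shape[dim]
    remainder = size % num_shards
    if remainder == 0:
        return w_q
    pad_shape = list(w_q.shape)
    pad_shape[dim] = num_shards - remainder
    pad = torch.zeros(pad_shape, dtype=w_q.dtype, device=w_q.device)
    return torch.cat([w_q, pad], dim=dim)
```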