etaf
Link to landed trunk PR (if applicable):
* https://github.com/pytorch/pytorch/pull/128383

Link to release branch PR:
* https://github.com/pytorch/pytorch/pull/128614

Criteria Category:
* (4) Test/CI fixes. Fix UT and unblock the xpu CI: before...
Link to landed trunk PR (if applicable):
* https://github.com/pytorch/pytorch/pull/124842

Link to release branch PR:
* https://github.com/pytorch/pytorch/pull/128615

Criteria Category:
* Critical fix, forward fix for https://github.com/pytorch/pytorch/issues/126173. Also update Intel triton for...
> Please update the PR description.

Resolved.
> The same comment as the previous PR on splitting into shim_xpu.h. I will review this one after you have updated the base PR.

@desertfire Thanks, I've split the AOTI XPU...
@EikanWang This PR depends on #141114
> > Also, this test is breaking Mac testing, although results were initially occluded by #142206
>
> Do you have a test failure link? I wonder what the problem...
The failed job: [xpu / linux-jammy-xpu-2025.0-py3.9 / test (default, 4, 4, linux.idc.xpu)](https://hud.pytorch.org/pr/pytorch/pytorch/140269#34163754963) ([gh](https://github.com/pytorch/pytorch/actions/runs/12245053349/job/34163754963)). The failure in inductor/test_torchinductor_opinfo.py::TestInductorOpInfoXPU::test_comprehensive_masked_cumprod_xpu_float16 is a known issue (#141861) and has been fixed on the main branch by #142348. Please ignore...
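For context on what the failing test exercises: in PyTorch's masked semantics, masked-out elements of a cumulative product act as the multiplicative identity, so they pass the running product through unchanged. A minimal plain-Python sketch of those semantics (an illustrative reference only, not PyTorch's implementation; the function name is hypothetical):

```python
def masked_cumprod(values, mask):
    """Sketch of masked cumulative product semantics.

    Elements where mask is False are treated as the identity (1.0),
    so the running product is unchanged at those positions.
    """
    out = []
    running = 1.0
    for v, m in zip(values, mask):
        if m:
            running *= v
        out.append(running)
    return out


# Masked-out middle element leaves the running product at 2.0.
print(masked_cumprod([2.0, 3.0, 4.0], [True, False, True]))
```

The XPU float16 failure tracked in #141861 was in the Inductor-compiled path for this op, not in these semantics themselves.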
@shunting314 Can you also resolve issue #143479 before the reland?
Duplicate of https://github.com/pytorch/pytorch/pull/137886.
Hi @jansel @desertfire, we've recently been working to improve UT coverage for the XPU backend. This PR is ready for review and has passed CI. We'd greatly appreciate it...