Tracking issue: PyTorch precision upcast
This issue tracks progress on fixing the output upcasting problem present in PyTorch/XLA.
- [ ] Main issue: #6403
- [x] Temporary fix for the benchmarking scripts: #6389
- [x] PyTorch `nn.Module` conversion fix: https://github.com/pytorch/pytorch/pull/117167
Once pytorch#117167 lands, we should revert pytorch/xla#6389 and stop setting `XLA_USE_{FP16,BF16}` in the benchmarking script.
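For context, the temporary workaround amounts to exporting one of these variables before launching a benchmark run. This is a sketch of the idea, not the exact script lines (those live in pytorch/xla#6389); `XLA_USE_BF16` and `XLA_USE_FP16` are the real torch_xla environment flags referenced above:

```shell
# Temporary workaround (to be removed once pytorch#117167 lands):
# force the XLA side to run the model in reduced precision.
export XLA_USE_BF16=1   # run in bfloat16 on XLA devices
# (alternatively: export XLA_USE_FP16=1 for float16)
echo "XLA_USE_BF16=$XLA_USE_BF16"
```

After the upstream `nn.Module` conversion fix lands, neither variable should be needed, since precision would then be handled by the model conversion itself.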
cc @miladm @JackCaoG @golechwierowicz