AllReduce XLA Token support
❓ Questions and Help
https://github.com/pytorch/xla/blob/master/torch_xla/csrc/init_python_bindings.cpp#L349 I was wondering whether XLA's AllReduce() already supports the standard XLA token. I'm trying to overlap computation and communication in torch_xla distributed training, which requires splitting the all-reduce at the XLA layer, and I'm considering what method to use to guarantee the ordering of the all-reduce operations. A rough sketch of the bucketed overlap I have in mind is below.
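For illustration only, here is a minimal sketch of the kind of per-bucket overlap described above, written against the public `xm.all_reduce` Python API rather than the C++ binding linked in the question. The hook placement and scaling are assumptions on my part, not a confirmed torch_xla recipe; the point is that each `xm.all_reduce` call is issued in a fixed program order, which is where the token/ordering question matters.

```python
# Sketch: issue a gradient all-reduce as soon as each gradient is produced,
# so collective communication can overlap with the remaining backward pass.
# Assumes torch_xla's public xm.all_reduce API; bucket granularity (here,
# one call per parameter) is illustrative only.
import torch
import torch_xla.core.xla_model as xm


def install_param_allreduce_hooks(model: torch.nn.Module, world_size: int) -> None:
    """Attach a backward hook to every parameter that all-reduces its
    gradient in place when the gradient becomes available."""

    def make_hook():
        def hook(grad):
            # In-place sum all-reduce of this gradient, averaged over replicas.
            # The relative order of these calls is the ordering concern raised
            # in the question (how XLA chains the collectives, e.g. via tokens).
            xm.all_reduce(xm.REDUCE_SUM, [grad], scale=1.0 / world_size)
            return grad
        return hook

    for p in model.parameters():
        if p.requires_grad:
            p.register_hook(make_hook())
```

Whether the separate all-reduce calls are actually serialized (and in what order) depends on how the lowered XLA collectives are chained, which is why I'm asking about token support in the AllReduce binding.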
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.