Jiewen Tan
> Do you need this PR in 2.3?

Yea, will also need a couple more for the TODOs.
Can I get any reviews?
> I still think we should refactor `convert_torch_dtype_to_jax` and investigate bf16 (which I assume most people will use), approve to unblock.

Yea, for sure. Let me follow up on that.
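For context on the review comment above, one way such a refactor could look is a table-driven mapping with bf16 covered explicitly. This is only an illustrative sketch under that assumption, not the actual torch_xla implementation (only the function name is taken from the comment):

```python
# Illustrative sketch of a torch -> JAX dtype mapping; not the real
# convert_torch_dtype_to_jax in torch_xla, just the refactor idea.
import torch
import jax.numpy as jnp

_TORCH_TO_JAX_DTYPE = {
    torch.float32: jnp.float32,
    torch.float64: jnp.float64,
    torch.bfloat16: jnp.bfloat16,  # bf16 handled explicitly
    torch.float16: jnp.float16,
    torch.int64: jnp.int64,
    torch.int32: jnp.int32,
    torch.int16: jnp.int16,
    torch.int8: jnp.int8,
    torch.uint8: jnp.uint8,
    torch.bool: jnp.bool_,
}

def convert_torch_dtype_to_jax(dtype: torch.dtype):
    """Map a torch.dtype to its jax.numpy equivalent."""
    try:
        return _TORCH_TO_JAX_DTYPE[dtype]
    except KeyError:
        raise NotImplementedError(f"No JAX equivalent for {dtype}")
```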
@JackCaoG No, I don't think there is anyone working on it at this moment.
@windmaple You need to install the nightly torch-xla and torch.
@ManfeiBai Can you take a look?
@windmaple Here are the instructions to install nightly: https://github.com/pytorch/xla#available-docker-images-and-wheels
@ManfeiBai Can you try reproducing it?
@PawKanarek What's your libtpu version?
@windmaple Yea, usually you just need the nightlies for both pytorch and pytorch/xla, since pytorch/xla heavily depends on pytorch.
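As a quick sanity check that matching nightlies are actually installed (a minimal sketch, assuming both packages import cleanly), you can print the versions:

```python
# Minimal check that nightly builds of torch and torch_xla are installed.
# Exact version strings will vary; nightlies typically carry a ".dev" tag.
import torch
import torch_xla

print("torch:", torch.__version__)
print("torch_xla:", torch_xla.__version__)
```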