Frank Liu

16 comments by Frank Liu

Just an update that I am still on it; the first PR will be out soon.

I found a fundamental problem with implementing the XLA generator. The generator base class https://github.com/pytorch/pytorch/blob/main/c10/core/GeneratorImpl.h assumes the generator's state consists of a `seed` and an `offset`; there are getter and setter methods for...
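To make the constraint concrete, here is a rough Python mirror of the interface shape I am describing. The real base class is C++ (`c10::GeneratorImpl`), so the exact method names and signatures may differ from this sketch, and `XlaGeneratorImpl` below is a hypothetical backend, not actual pytorch/xla code.

```python
# Rough Python mirror of the seed/offset assumption baked into the base class.
# Illustration only; the real interface lives in C++ (c10/core/GeneratorImpl.h)
# and its method names/signatures may differ.
from abc import ABC, abstractmethod


class GeneratorImplLike(ABC):
    """Stand-in for the C++ base class: RNG state is assumed to be (seed, offset)."""

    @abstractmethod
    def set_current_seed(self, seed: int) -> None: ...

    @abstractmethod
    def current_seed(self) -> int: ...

    @abstractmethod
    def set_offset(self, offset: int) -> None: ...

    @abstractmethod
    def get_offset(self) -> int: ...


class XlaGeneratorImpl(GeneratorImplLike):
    """Hypothetical XLA backend generator: it must express its RNG state through
    the same seed/offset pair, even if XLA's own RNG state does not map cleanly
    onto an offset."""

    def __init__(self) -> None:
        self._seed = 0
        self._offset = 0

    def set_current_seed(self, seed: int) -> None:
        self._seed = seed

    def current_seed(self) -> int:
        return self._seed

    def set_offset(self, offset: int) -> None:
        self._offset = offset

    def get_offset(self) -> int:
        return self._offset
```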

I am also curious about which versions and configurations of PyTorch it will pin to. Currently, pytorch/xla assumes the HEAD of main, and some features rely on "unreleased" code...

A key advantage of PyTorch/XLA has always been the performance gains achieved by optimizing the computation graph as a whole. With Eager Mode becoming the default alongside true just-in-time compilation,...
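For context, a minimal sketch of the two execution styles I have in mind, assuming a recent torch_xla build; the API spellings (`torch_xla.experimental.eager_mode`, `torch_xla.compile`, `xm.mark_step`) are my assumption and may differ between versions.

```python
# Minimal sketch contrasting whole-graph (lazy) execution with eager mode plus
# an explicitly compiled region. Assumes a recent torch_xla; API names may vary.
import torch
import torch_xla
import torch_xla.core.xla_model as xm

device = xm.xla_device()


def step(x, w):
    return torch.relu(x @ w).sum()


x = torch.randn(128, 64, device=device)
w = torch.randn(64, 32, device=device)

# Lazy/graph mode: ops are only traced; the accumulated graph is compiled and
# executed as a whole when the step is marked (or a value is actually needed).
loss = step(x, w)
xm.mark_step()

# Eager mode: each op runs immediately, so whole-graph optimization is limited
# to regions that are explicitly wrapped, e.g. with torch_xla.compile.
torch_xla.experimental.eager_mode(True)
compiled_step = torch_xla.compile(step)
loss = compiled_step(x, w)
```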

> if you don't materialize any tensors in eager mode in most cases it will defer everything and send an entire computational graph to XLA.

How can I avoid materializing...
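To make sure I understand "materialize" correctly, here is a small sketch of what I think counts as materializing, assuming standard lazy-tensor semantics; the specific triggers listed are my assumption, not an exhaustive list.

```python
# Sketch of deferral vs. materialization with XLA tensors, as I understand it.
# Assumes lazy-tensor semantics; the materialization triggers below are typical
# examples, not an exhaustive or authoritative list.
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()
a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)

c = (a @ b).tanh()   # deferred: only extends the traced graph
d = c.sum()          # still deferred

# Any of these forces the pending graph to be compiled and executed,
# because the actual value is needed on the host:
print(d)
_ = d.item()
_ = d.cpu()

# Without such calls, the graph keeps growing and is sent to XLA as a whole
# at an explicit cut point:
xm.mark_step()
```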

I suggest checking in https://github.com/pytorch/xla/pull/9703 before submitting this PR and keeping the two PRs separate. I can take a look at https://github.com/pytorch/xla/pull/9703 if you cannot find anyone working on...