Results 32 comments of Wonjoo Lee

Hm, seems like we can't codegen `all.dim` yet. The generated `LazyIr.h` for `all.dim` looks like:

```cpp
class AllDim : public XlaNode {
 public:
  static torch::lazy::OpKind ClassOpKind() {
    return torch::lazy::OpKind(at::aten::all);
  }
  ...
```

@JackCaoG, this should be ready for review. Thanks!

A couple of `clamp` unit tests are failing:

```
ERROR: test_clamp_xla_float32 (__main__.TestDevicePrecisionXLA)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/opt/conda/lib/python3.7/site-packages/torch/testing/_internal/common_device_type.py", line 390, in instantiated_test
    raise rte
  File "/opt/conda/lib/python3.7/site-packages/torch/testing/_internal/common_device_type.py", line 377, in instantiated_test
...
```

The error was due to missing promotion logic for two-tensor inputs in the shape inference and lowering functions. The latest commit fixes this; all tests now pass locally.
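The promotion logic referenced above can be sketched as follows. This is a minimal, self-contained illustration of rank-based dtype promotion for two tensor inputs, loosely mimicking PyTorch's `at::promote_types`; the `DType` enum, its ranking, and the `PromoteTypes` helper are hypothetical simplifications, not the actual torch/xla API.

```cpp
#include <cassert>
#include <algorithm>

// Hypothetical dtype lattice; ranks are illustrative. In the real code the
// shape inference and lowering functions would defer to upstream promotion
// helpers rather than hand-roll this.
enum class DType { Int32 = 0, Int64 = 1, Float32 = 2, Float64 = 3 };

// Two-tensor promotion sketch: the higher-ranked dtype wins, so integer
// inputs promote to floating point and narrower types widen.
DType PromoteTypes(DType a, DType b) {
  return static_cast<DType>(
      std::max(static_cast<int>(a), static_cast<int>(b)));
}
```

With this rule, an `Int64`/`Float32` pair lowers with a `Float32` result type, which is the kind of mismatch the failing `clamp` tests were hitting.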

> @wonjoolee95 [this change](https://github.com/pytorch/xla/commit/15f12393df9ca4d46ea71a2eeda857942790faee) is breaking test in this CI.

Hmm, that PR should actually fix the error shown in the CI. Could you try doing a quick rebase on...

Just FYI, I've been observing the `LsbMask` error for a while and haven't been able to resolve it. FWIW, I noticed that another user has opened a related ticket in...

Thanks for the ideas. I'll spend some time seeing whether we can call `GetXlaShape(value)` throughout our code and drop `XlaValue` if possible -- this seems like a cleaner solution...

A couple of tests are failing with `RuntimeError: Lazy tensor backend not registered`. The error seems to be thrown here: https://github.com/pytorch/pytorch/blob/master/torch/csrc/lazy/backend/backend_interface.cpp#L16. I can reproduce the error locally through the Python IDLE.
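For context, the error above comes from a registration-check pattern: a process-wide backend pointer is set once by the backend plugin, and the getter throws if nothing registered it. The sketch below is a hypothetical reduction of that pattern; `BackendInterface`, `RegisterBackend`, and `GetBackend` are illustrative names, not the exact `torch::lazy` signatures.

```cpp
#include <cassert>
#include <stdexcept>

// Minimal stand-in for the lazy backend interface.
struct BackendInterface {
  virtual ~BackendInterface() = default;
};

// Process-wide registration slot; the real code guards this with
// additional machinery, omitted here for brevity.
static const BackendInterface* backend_impl = nullptr;

void RegisterBackend(const BackendInterface* impl) { backend_impl = impl; }

// Throws if the backend plugin (e.g. torch_xla) never registered itself,
// producing the "Lazy tensor backend not registered" failure mode.
const BackendInterface* GetBackend() {
  if (backend_impl == nullptr) {
    throw std::runtime_error("Lazy tensor backend not registered");
  }
  return backend_impl;
}
```

The practical implication is that the failing tests are likely importing the lazy tensor APIs before the backend library has had a chance to register, rather than hitting a bug in the ops themselves.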

> Quick question: I notice in the class Flip there is a private member `::std::vector dims;`. For me, most of the arguments I pass to an XlaNode constructor are converted...
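The distinction the question is getting at can be sketched as follows: `dims` is a static attribute of the flip op, not a tensor operand, so it is stored by value on the node rather than being converted into an input `Value`. The class below is a hypothetical simplification; it is not the actual `torch::lazy`/torch_xla `Flip` node.

```cpp
#include <cassert>
#include <cstdint>
#include <utility>
#include <vector>

// Illustrative IR node: tensor operands would be passed to the base node as
// inputs, while non-tensor attributes like `dims` are kept as plain members
// so the lowering can read them directly.
class FlipNode {
 public:
  explicit FlipNode(std::vector<int64_t> dims) : dims_(std::move(dims)) {}

  const std::vector<int64_t>& dims() const { return dims_; }

 private:
  std::vector<int64_t> dims_;  // op attribute, not a graph input
};
```

Keeping attributes as members (instead of baking them into operands) also lets them participate in the node's hash, so two flips over different dims are distinct cache entries.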

The upstream PR's base branch is a bit outdated; I've left a comment asking to rebase the PR.