Thomas Viehmann

Results: 227 comments by Thomas Viehmann

I think this might not be an autocast but more a `cat` problem: PyTorch will `cat` fp16 and fp32:

```
In [1]: import torch

In [2]: a = torch.randn(2, 2)

In [3]: b = torch.randn(2, 2, dtype=torch.float16)

In [4]: torch.cat([a, b]).dtype
Out[4]: torch.float32
```

So modifying `cat` to do upcasting as in PyTorch lets us run the trace, but it would insert casting ops which we would then need to undo in the autocast transform.
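Roughly what that would look like (a sketch in plain PyTorch for illustration, not actual Thunder prims):

```
import torch

# sketch: what cat-with-promotion would effectively record in the trace,
# written in plain PyTorch rather than Thunder prims
def traced_cat(a_fp32, b_fp16):
    b_fp32 = b_fp16.to(torch.float32)   # the inserted casting op
    return torch.cat([a_fp32, b_fp32])  # dtypes now match

# the autocast transform, wanting fp16 compute here, would then have to
# recognize and remove that inserted cast
```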

> I get torch.float16 without Thunder enabled and float32 with Thunder

So I think we expect the tracing itself to run without autocast, with the autocast transform inserting casts as needed.
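For reference, a minimal sketch of the kind of repro in question (the shapes and the linear op are made up for illustration; assumes a CUDA device):

```
import torch

def f(x, w):
    return torch.nn.functional.linear(x, w)

x = torch.randn(4, 8, device="cuda")
w = torch.randn(8, 8, device="cuda")

with torch.autocast("cuda", dtype=torch.float16):
    print(f(x, w).dtype)  # eager PyTorch: torch.float16

# the expectation for Thunder is that tracing f runs without autocast
# and the autocast transform inserts the fp16 casts afterwards
```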

Well, we do have SSA form in the Thunder IR until we allow inplace operations, which seems to be one of the things people want to do. The transformation to...

Exactly, or even "let's preserve our SSA form and the fact that dataflow describes the order of operations (and admissible reorderings) even when we want inplace".
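A generic sketch of the rewrite such a functionalization pass would do (plain Python for illustration, not actual Thunder passes):

```
import torch

# before: an in-place op mutates x, so "x" names different values over time
# and the order of operations is no longer visible from dataflow alone
def f(x):
    x.add_(1)
    return x * 2

# after functionalization: every op produces a fresh value, restoring SSA,
# so dataflow again determines the order (and the admissible reorderings)
def f_ssa(x):
    x1 = x + 1   # out-of-place replacement for add_
    return x1 * 2
```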

I am not sure that "there is a PyTorch programming style that makes it easy for us" helps us that much because silently producing incorrect results for valid PyTorch code...

Seems like #264 would also benefit from an SSA/functionalization pass, as it also deals with implicit state (except that it seems simpler in that we don't need to worry about aliasing).
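The general pattern (a generic illustration of threading implicit state explicitly, not the specifics of #264):

```
# implicit state: a hidden value mutated as a side effect, invisible to dataflow
_counter = 0
def step_implicit(x):
    global _counter
    _counter += 1
    return x + _counter

# explicit/SSA form: the state is passed in and returned, so the ordering
# of calls is captured by dataflow alone
def step_explicit(x, counter):
    counter1 = counter + 1
    return x + counter1, counter1
```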

Note that unless you rearrange the mixing relative to how it is commonly implemented, you will have data-dependent control flow.
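For illustration, data-dependent control flow means a branch whose direction depends on tensor values, which a tracer cannot resolve ahead of time (a generic example, not the code under discussion):

```
import torch

def f(x):
    # the branch taken depends on the values in x, so a recorded trace
    # is only valid for inputs that take the same branch
    if x.sum() > 0:
        return x * 2
    return x - 1
```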

I don't think it's on the roadmap any time soon.