Alex Rogozhnikov

Results: 187 comments by Alex Rogozhnikov

@davidnvq I'm also poking around the same issues (#20, #50), and pre-testing some concepts. I'll try to post some thoughts on how that can be written soon (but no promises!)

@p4perf4ce thanks for thinking about that out loud with examples. I've been poking around with an operation semantics (which I've dubbed `rechunk`); it has some overlap with your suggestion. One critical...

Hi @StephenHogg

> Why is there a division step happening?

When you use, say, `rearrange(x, '(i j) -> i j', j=3)`, the size of `i` is computed by division of x.shape[0]...
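
For illustration, a minimal example of that inference (the array and axis sizes here are made up):

```python
import numpy as np
from einops import rearrange

x = np.arange(12)                      # x.shape == (12,)
y = rearrange(x, '(i j) -> i j', j=3)  # i is inferred as 12 // 3 == 4
print(y.shape)                         # (4, 3)
```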

Warning is there; likely you didn't restart the kernel:

```python
In [1]: import torch

In [2]: import einops

In [3]: from torch import nn

In [4]: def f(x):
   ...:     return x.shape[0]...
```

That's correct: no computations are done if the sizes of all dimensions are given.
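
A minimal sketch of that case (the array and sizes are illustrative):

```python
import numpy as np
from einops import rearrange

x = np.arange(12)
# both axis sizes are supplied, so nothing is inferred by division
y = rearrange(x, '(i j) -> i j', i=4, j=3)
```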

@samansarraf that's unrelated. Can you post in [discussions](https://github.com/arogozhnikov/einops/discussions)?

> will the grad be accumulated to original a_s

Yes, it will. All operations are normally differentiable.
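
A quick sketch (shapes are made up) of gradients flowing back through `rearrange` to the original tensor:

```python
import torch
from einops import rearrange

a = torch.randn(2, 3, requires_grad=True)
b = rearrange(a, 'i j -> j i')  # differentiable, like any transpose
b.sum().backward()
print(a.grad.shape)             # torch.Size([2, 3]) -- grad accumulated to the original tensor
```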

Hi @leoentersthevoid, there is no support for such cases. If those are to appear, they should be separate functions or explicit flags, so there is no need to wait, since you can...

Good question! Let's assume we are talking in pytorch terms. It is common to use `.contiguous()` before using `view` in pytorch; however, einops replaces `view` (e.g. with `rearrange`). Einops operations...
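
A minimal sketch of the contrast (shapes are illustrative): `view` fails on a non-contiguous tensor, while `rearrange` handles it without a manual `.contiguous()` call, presumably by falling back to `reshape`, which copies when needed:

```python
import torch
from einops import rearrange

x = torch.randn(3, 4)
y = x.t()  # transpose -> non-contiguous tensor

# y.view(12) would raise a RuntimeError because y is not contiguous
z = rearrange(y, 'i j -> (i j)')  # works without an explicit .contiguous()
print(z.shape)                    # torch.Size([12])
```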

I'll keep this question open for reference.