Support for einops
Hi :)
I am a big fan of einops and I believe that supporting it within the framework would be highly valuable, especially for quick prototyping and situations where performance isn't the primary focus. Are there any plans to implement a feature akin to torch.einsum within the framework? If there's a plan for something like mlx.einsum and contributions are welcome, I'd be eager to contribute.
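For reference, a minimal sketch of the kind of call this would enable. Note that `mx.einsum` here is the hypothetical API being requested (mirroring `torch.einsum`'s equation-string interface), not something MLX shipped at the time of this thread:

```python
import mlx.core as mx

a = mx.random.normal((3, 4))
b = mx.random.normal((4, 5))

# Matrix multiplication written as an einsum equation: "ij,jk->ik".
# Hypothetical mlx einsum call, equivalent to a @ b.
c = mx.einsum("ij,jk->ik", a, b)
print(c.shape)  # (3, 5)
```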
It seems that Apple MLX support was already proposed within einops.
I second that. einops has saved me quite a few times when the logic was becoming a bit intractable.
We'll definitely keep this on the feature roadmap. I think if we had an einsum like NumPy's, integration with einops would be doable. (Contributions welcome!)
We'll also need a tile op to support einops.
After #278 and #240 get merged, that should be all that's needed to create a backend for einops.
Einops maintainer here.
The simplest way to get einops functions for a framework is to implement the array API: einops.array_api works with all array-API-compatible tensors (there is also support for scikit-learn and eindex). It also simplifies prioritization of ops a lot for framework devs, as what's in the standard is the bare minimum.
We can close this, as the latest MLX works with the einops.array_api backend.
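As a usage sketch (assuming an einops version that ships the experimental einops.array_api module and an MLX build that exposes the array API, per the comment above):

```python
import mlx.core as mx
from einops.array_api import rearrange, reduce

x = mx.random.normal((8, 3, 32, 32))  # batch, channels, height, width

# Flatten the channel and spatial dimensions through the array-API path of einops.
flat = rearrange(x, "b c h w -> b (c h w)")
print(flat.shape)  # (8, 3072)

# Global average pooling expressed as an einops reduction.
pooled = reduce(x, "b c h w -> b c", "mean")
print(pooled.shape)  # (8, 3)
```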