
Document state of PyTorch upstreaming in the README

Open · bmillwood opened this issue on May 16 '24 · 2 comments

The README mentions that some of these optimizations already exist upstream in numpy, but that you need to pass optimize=True to np.einsum to access them. This is useful!
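For concreteness, a minimal sketch of the numpy usage the README is referring to (the shapes here are just illustrative):

```python
import numpy as np

a = np.random.rand(10, 20)
b = np.random.rand(20, 30)
c = np.random.rand(30, 40)

# By default np.einsum contracts the operands naively;
# optimize=True asks numpy to search for a cheaper contraction order.
result = np.einsum("ij,jk,kl->il", a, b, c, optimize=True)
```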

The docs for torch.einsum suggest that it already uses opt_einsum automatically if available (see also the discussion at https://github.com/dgasmith/opt_einsum/pull/205). It would be helpful to mention that here too, and to say whether it's necessary to explicitly import opt_einsum to get this behaviour (I believe it isn't), potentially also mentioning torch.backends.opt_einsum.is_available() and torch.backends.opt_einsum.enabled, or anything else that seems relevant or useful. (I think the torch docs could also be improved here, and I may submit an issue or PR there, but I think it would be useful to say something here regardless.)
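To illustrate, here's a minimal sketch of the torch side, assuming PyTorch 1.13 or later (where the torch.backends.opt_einsum module appeared); the shapes are again just illustrative:

```python
import torch

# Reports whether the opt_einsum package was found on the Python path;
# no explicit "import opt_einsum" is needed in user code.
print(torch.backends.opt_einsum.is_available())

# When opt_einsum is available, torch.einsum uses its path optimization
# by default; the enabled flag can toggle this off.
torch.backends.opt_einsum.enabled = True

a = torch.rand(10, 20)
b = torch.rand(20, 30)
c = torch.rand(30, 40)
out = torch.einsum("ij,jk,kl->il", a, b, c)
```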

Doing something like this for every supported opt_einsum backend might be quite a task, but let's not let perfect be the enemy of good :)

bmillwood · May 16 '24 13:05