opt_einsum
Find optimal path for caching constant intermediates
For example:
i,ij,jk->k where the sizes are [5], [5,100], [100,5], and the second and third tensors are constants. Then it is a good idea to cache ij,jk->ik.
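A minimal sketch of that setup (array names here are only illustrative), showing the manual caching of the constant-only intermediate:

```python
import numpy as np
import opt_einsum as oe

x = np.random.rand(5)        # varies between calls
A = np.random.rand(5, 100)   # constant
B = np.random.rand(100, 5)   # constant

# Full contraction, recomputed from scratch on every call:
out = oe.contract('i,ij,jk->k', x, A, B)

# Caching the constant-only intermediate ij,jk->ik by hand:
AB = oe.contract('ij,jk->ik', A, B)         # computed once, offline
out_cached = oe.contract('i,ik->k', x, AB)  # cheap per-call work

assert np.allclose(out, out_cached)
```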
I believe the following will help you: https://dgasmith.github.io/opt_einsum/getting_started/sharing_intermediates/
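A hedged sketch of how the linked sharing feature could be used for this example. Note that the default optimal path for this expression contracts x with A first (cheaper for a single call), in which case the constant product never appears as a shareable intermediate, so an explicit path is pinned below:

```python
import numpy as np
from opt_einsum import contract, shared_intermediates

A = np.random.rand(5, 100)   # constant
B = np.random.rand(100, 5)   # constant

with shared_intermediates():
    for _ in range(10):
        x = np.random.rand(5)  # varying operand
        # Force the constant-only contraction (operands 1 and 2) first,
        # so its result can be shared across iterations.
        contract('i,ij,jk->k', x, A, B, optimize=[(1, 2), (0, 1)])
```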
I meant doing the optimization under the assumption that the constant tensors can be contracted beforehand, offline. The above functionality only caches the intermediates, but the path planning might not consider which tensors are constant.
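One possible workaround, assuming the constants really are fixed ahead of time, is to fold them manually and build a reusable expression over only the variable operand. This is a hand-written sketch, not constant-aware path planning by the library:

```python
import numpy as np
import opt_einsum as oe

A = np.random.rand(5, 100)   # constant
B = np.random.rand(100, 5)   # constant

# Offline: contract the constant-only part once.
AB = oe.contract('ij,jk->ik', A, B)

# Online: a precompiled expression for the remaining contraction.
expr = oe.contract_expression('i,ik->k', (5,), AB.shape)

for _ in range(1000):
    x = np.random.rand(5)
    out = expr(x, AB)
```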
Yes, cache/intermediate-aware paths are something we have discussed but have not implemented. It isn't clear that there is a general approach to this problem, as straightforward approaches quickly become combinatorial in nature.
Happy to take a PR which attempts this functionality!