
generalize to arbitrary map-reduce?

Open ssfrr opened this issue 4 years ago • 3 comments

It occurs to me that if you could apply functions inside the expression, and also specify a reduction function other than +, you could do pretty much arbitrary map-reduce-style operations on slices of multidimensional arrays. That would replace mapslices and eachslice, as well as many of the functions that take a dims keyword argument.

I understand that such expressions might not be able to fall back to the fast paths, but having a generic way to do these sorts of operations seems like it would have broadcast-level impact, and would clean up a lot of the inconsistency around mapslices etc.

@stefankarpinski
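
For concreteness, here is a rough sketch, in plain base Julia rather than any proposed OMEinsum syntax, of the scattered operations such a generalisation would subsume:

using LinearAlgebra

A = randn(4, 4, 10);

mapslices(det, A; dims=[1,2])              # map a function over each 4×4 slice, giving 1×1×10
[det(M) for M in eachslice(A; dims=3)]     # the same idea via eachslice, giving a length-10 Vector
maximum(A; dims=3)                         # a reduction with a function other than +, giving 4×4×1
sum(abs2, A; dims=[1,2])                   # map + reduce in one call, giving 1×1×10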

ssfrr avatar Feb 26 '20 17:02 ssfrr

I'll also add that, as a non-physicist, Einstein notation always seemed kind of esoteric to me, but after playing with it a little it quickly became very intuitive, and now I want to use it everywhere. :)

ssfrr avatar Feb 26 '20 17:02 ssfrr

So this is almost precisely what my package TensorCast does (with apologies for the advertising):

using TensorCast, Statistics, LinearAlgebra

T = randn(10,10,5);
@cast E[i,n] := eigen(T[:,:,n]).values[i]  # generalised mapslices

p = rand(3,4);
@reduce q[y] := mean(x) p[x,y] * exp(-p[x,y])  # any reduction function

@reduce r[x,z] := sum(y) conj(p[y,x]) * p[y,z]  lazy  # inefficient p' * p
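
For comparison, eager plain-Julia equivalents of those three lines are roughly:

E2 = reduce(hcat, [eigen(T[:,:,n]).values for n in axes(T,3)])  # 10×5, like the @cast line
q2 = vec(mean(p .* exp.(-p); dims=1))                           # length-4, like the first @reduce
r2 = p' * p                                                     # 4×4; the conj is a no-op since p is real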

OMEinsum's fallback method is (if I understand right) essentially a better version of the LazyArrays broadcasting + reduction being used here -- both avoid materializing the 3×4×4 array before summing over one dimension. (I say better because I had all sorts of performance issues with the lazy approach when I last looked, and I'm not sure it works on GPU at all, etc.)
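
For reference, a sketch of the same contraction written directly with OMEinsum (assuming its ein"..." string-macro notation):

using OMEinsum
r3 = ein"yx,yz->xz"(conj(p), p)   # contracts over y directly, no 3×4×4 intermediate is stored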

mcabbott avatar Feb 26 '20 22:02 mcabbott

Indeed, that seems very much like what I'm looking for. Thanks for letting me know about it!

ssfrr avatar Feb 27 '20 17:02 ssfrr