
support `conj` in expressions

Open ssfrr opened this issue 4 years ago • 5 comments

For example, for a complex inner product the expression should look something like this:

x1 = randn(ComplexF64, 10)
x2 = randn(ComplexF64, 10)
@ein y[] := x1[i]' * x2[i]

(I'm actually not sure if this is the right way to represent a scalar output, but you get the idea).

But this throws an error.

LoadError: type Nothing has no field inds
in expression starting at /home/sfr/research_journal/2020/02/16/multichannel_wiener_filter.jmd:353
getproperty(::Nothing, ::Symbol) at Base.jl:20
(::OMEinsum.var"#78#80")(::Nothing) at einsequence.jl:147
_mapreduce(::OMEinsum.var"#78#80", ::typeof(vcat), ::IndexLinear, ::Array{Union{Nothing, OMEinsum.IndexGroup{Int64}},1}) at reduce.jl:309
...

I get the same error with an explicit conj(x1[i]).

ssfrr avatar Feb 25 '20 20:02 ssfrr

You can write @ein y[] := conj(x1)[i] * x2[i], it turns out, or ein"i,i->"(conj(x1), x2). But perhaps not ideal?
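A minimal sketch of both workarounds (assuming a recent OMEinsum; the `dot` comparison is only there to check the result against `LinearAlgebra`):

```julia
using OMEinsum, LinearAlgebra

x1 = randn(ComplexF64, 10)
x2 = randn(ComplexF64, 10)

# conjugate the whole array first, then contract
@ein y[] := conj(x1)[i] * x2[i]

# string-macro equivalent
y2 = ein"i,i->"(conj(x1), x2)

# both should agree with the ordinary inner product
y[] ≈ y2[] ≈ dot(x1, x2)
```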

mcabbott avatar Feb 25 '20 21:02 mcabbott

When applying the function to the array name I'd want to write it as conj.(x1)[i]; in my mental model it's just applying a function to the whole array before doing the einsum magic. From a performance standpoint, though, it seems you'd want it to happen inside the loop rather than acting on a copy.

Thanks for the tip! It's great there's a way to do this, and it doesn't need to be ideal. Seems like it would be a bit of work on the @ein parser to identify function calls (and/or the ' syntax for adjoint) and move them inside the loop.

ssfrr avatar Feb 26 '20 16:02 ssfrr

It would indeed be nice to apply elementwise functions in the loops and/or in special dispatches. While that would be easy-ish for the macro syntax and for dispatches to our custom loops, it wouldn't work for the string-macro form (ein"i,i->"(conj(x1), x2)), since conj(x1) would be evaluated before the einsum functionality is called. That would go against my intent to have the string macro, the macro, and the regular function call work basically the same, and it would also add complexity to the custom dispatches.

If you work with large tensors, I'd think the O(N) elementwise operations should generally be negligible compared to, e.g., the O(N^3) tensor-contraction operations. If memory is a problem, you might want to use the in-place version, conj!, although that would conflict with AD functionality if you need it.
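For reference, the in-place conjugation mentioned above is plain Base Julia and needs no OMEinsum at all:

```julia
x = ComplexF64[1 + 2im, 3 - 4im]
conj!(x)   # overwrites x with its elementwise conjugate, no copy allocated
# x == ComplexF64[1 - 2im, 3 + 4im]
```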

under-Peter avatar Feb 26 '20 16:02 under-Peter

Oh, I just realized that the reason conj works is that it does the elementwise conj before applying the einsum magic; conj has a vector method defined.

I just ran across a similar issue trying to use abs2, which does not have a ::Vector method defined; using dot-notation (abs2.(x)) worked well.
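For the record, a small sketch of that broadcast workaround (the names and sizes are illustrative, and this assumes OMEinsum's string-macro reduction over a lone index):

```julia
using OMEinsum

x = randn(ComplexF64, 10)

# abs2 has no ::Vector method, so broadcast it before the contraction;
# "i->" sums over i, giving a 0-dimensional result
s = ein"i->"(abs2.(x))[]

# s should equal sum(abs2, x), i.e. the squared 2-norm of x
```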

As far as I know the elementwise operations aren't a bottleneck for me right now, so this is definitely not high-priority. Just to throw out my perspective, though: I don't always work with large tensors, and I'm starting to want to use @ein for all kinds of things, most of which probably aren't tensor contractions. It's just really nice notation for munging multidimensional data. :)

ssfrr avatar Feb 26 '20 17:02 ssfrr

I guess you could have notation like ein"i*,i->"(x,y) == dot(x,y) == @ein x[i]' * y[i], or something? BLAS routines all have methods for conjugated / transposed / normal arguments, but adjusting this package to propagate that extra information sounds like a fairly big job. (It seems like none of the Python einsum flavours care much about complex numbers.)

For things that aren't matrix multiplication, BTW, you may want my TensorCast, which allows @reduce z := sum(i) x[i]' * y[i].
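A sketch of that TensorCast form (assuming the @reduce macro behaves as described above; the dot comparison is only a sanity check):

```julia
using TensorCast, LinearAlgebra

x = randn(ComplexF64, 10)
y = randn(ComplexF64, 10)

# sum over i of conj(x[i]) * y[i]
@reduce z := sum(i) x[i]' * y[i]

# z should match dot(x, y)
```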

mcabbott avatar Feb 26 '20 19:02 mcabbott

conj is not part of einsum; TensorCast should be the correct place to go.

GiggleLiu avatar Sep 23 '23 19:09 GiggleLiu