ChainRules.jl
make 3-arg dot rrule partially lazy
This addresses #788. I had to remove the projection to make it work; otherwise I get the following error due to a missing projection method. Projecting the lazy array to a dense array when A is dense would partially defeat the purpose of this PR, so I am leaving it up to the review process to decide what to do here. I can define a projection method if that's preferred.
julia> show(err)
1-element ExceptionStack:
MethodError: no method matching (::ChainRulesCore.ProjectTo{AbstractArray, @NamedTuple{element::ChainRulesCore.ProjectTo{Float64, @NamedTuple{}}, axes::Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}})(::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{2}, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}, typeof(*), Tuple{Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1}, Nothing, typeof(*), Tuple{Float64, Vector{Float64}}}, Adjoint{Float64, Vector{Float64}}}})
Closest candidates are:
(::ChainRulesCore.ProjectTo{T})(::ChainRulesCore.NotImplemented) where T
@ ChainRulesCore ~/.julia/packages/ChainRulesCore/zgT0R/src/projection.jl:121
(::ChainRulesCore.ProjectTo)(::ChainRulesCore.Thunk)
@ ChainRulesCore ~/.julia/packages/ChainRulesCore/zgT0R/src/projection.jl:124
(::ChainRulesCore.ProjectTo{AbstractArray})(::Number)
@ ChainRulesCore ~/.julia/packages/ChainRulesCore/zgT0R/src/projection.jl:253
...
Stacktrace:
[1] (::ChainRules.var"#1966#1970"{Adjoint{Float64, Vector{Float64}}, Float64, Vector{Float64}, ChainRulesCore.ProjectTo{AbstractArray, @NamedTuple{element::ChainRulesCore.ProjectTo{Float64, @NamedTuple{}}, axes::Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}}})()
@ ChainRules ~/.julia/dev/ChainRules/src/rulesets/LinearAlgebra/dense.jl:39
[2] unthunk
@ ~/.julia/packages/ChainRulesCore/zgT0R/src/tangent_types/thunks.jl:204 [inlined]
[3] wrap_chainrules_output
@ ~/.julia/packages/Zygote/nsBv0/src/compiler/chainrules.jl:110 [inlined]
[4] map
@ ./tuple.jl:293 [inlined]
[5] map
@ ./tuple.jl:294 [inlined]
[6] wrap_chainrules_output
@ ~/.julia/packages/Zygote/nsBv0/src/compiler/chainrules.jl:111 [inlined]
[7] ZBack
@ ~/.julia/packages/Zygote/nsBv0/src/compiler/chainrules.jl:211 [inlined]
[8] (::Zygote.var"#75#76"{Zygote.ZBack{ChainRules.var"#dot_pullback#1968"{Vector{Float64}, Matrix{Float64}, Vector{Float64}, Vector{Float64}, ChainRulesCore.ProjectTo{AbstractArray, @NamedTuple{element::ChainRulesCore.ProjectTo{Float64, @NamedTuple{}}, axes::Tuple{Base.OneTo{Int64}}}}, ChainRulesCore.ProjectTo{AbstractArray, @NamedTuple{element::ChainRulesCore.ProjectTo{Float64, @NamedTuple{}}, axes::Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}}, ChainRulesCore.ProjectTo{AbstractArray, @NamedTuple{element::ChainRulesCore.ProjectTo{Float64, @NamedTuple{}}, axes::Tuple{Base.OneTo{Int64}}}}}}})(Δ::Float64)
@ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface.jl:91
[9] top-level scope
@ REPL[6]:1
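For context, a minimal sketch of the kind of partially lazy rule being discussed (not the exact PR code; real-valued inputs are assumed, and in practice this would replace the rule in ChainRules' own dense.jl rather than be defined downstream) could look like:

```julia
using LinearAlgebra
using ChainRulesCore

# Sketch of a partially lazy rrule for dot(x, A, y): the tangent for A is
# left as a lazy Broadcasted outer product, (dz * x) .* y', so the dense
# matrix is never materialised unless the caller asks for it.
function ChainRulesCore.rrule(::typeof(dot), x::AbstractVector, A::AbstractMatrix, y::AbstractVector)
    Ay = A * y
    z = dot(x, Ay)
    function dot_pullback(dz)
        dx = @thunk(dz .* Ay)            # dz * (A*y)
        # Lazy outer product (dz * x) .* y' -- this is exactly the
        # Broadcasted object the ProjectTo{AbstractArray} in the error
        # above has no method for.
        dA = @thunk(Broadcast.broadcasted(*, Broadcast.broadcasted(*, dz, x), y'))
        dy = @thunk(dz .* (A' * x))      # dz * (A'*x)
        return NoTangent(), dx, dA, dy
    end
    return z, dot_pullback
end
```

The tangent for A stays a `Base.Broadcast.Broadcasted` until someone materialises it, which is where the missing `ProjectTo` method bites.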
This is related to the discussion in https://github.com/FluxML/Zygote.jl/issues/1507.
This is also technically a breaking change if we don't project.
We may need to teach LazyArrays about projections, so that they project lazily?
I like the idea of lazy projections. Maybe that would also provide a way to opt out of the projection at the gradient level by calling a get_projection_parent function to undo the last projection.
Hello, are there any updates on this? I really need to compute the gradient of dot(x, A, y), with x and y dense vectors and A a sparse matrix. The current rule converts the sparse matrix into a dense one, which is very inefficient for large matrices.
I suppose you could just define your own mydot function for now and give it a rule following this PR.
Maybe someone could even create a package, LazyLinearAlgebra, that defines most linear-algebra operations in a lazy fashion together with their lazy rules.
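For the sparse case above specifically, a sketch of that workaround might look as follows. `mydot` and its rule are local names, not part of any package, and real-valued inputs are assumed:

```julia
using LinearAlgebra, SparseArrays
using ChainRulesCore

# Local workaround: a 3-arg dot whose rule restricts the tangent of A to
# A's sparsity pattern, so the dense outer product x*y' is never formed.
mydot(x, A, y) = dot(x, A, y)

function ChainRulesCore.rrule(::typeof(mydot), x::AbstractVector, A::SparseMatrixCSC, y::AbstractVector)
    Ay = A * y
    z = dot(x, Ay)
    function mydot_pullback(dz)
        dx = @thunk(dz .* Ay)
        dy = @thunk(dz .* (A' * x))
        dA = @thunk begin
            # Only the stored entries of A receive a tangent.
            I, J, _ = findnz(A)
            V = [dz * x[i] * y[j] for (i, j) in zip(I, J)]
            sparse(I, J, V, size(A)...)
        end
        return NoTangent(), dx, dA, dy
    end
    return z, mydot_pullback
end
```

Restricting dA to the sparsity pattern is the same thing `ProjectTo` does for a `SparseMatrixCSC` primal, so this should stay consistent with ChainRules' usual projection semantics.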
What makes it so difficult to apply the rrule directly to the standard dot function? Sorry for the simple question, but I'm not very familiar with automatic differentiation. I just need this to compute expectation values of an operator.
The problem is that it is generally required to project the tangent back down onto the tangent space once it has been computed. Explaining why is a little involved (the docs cover it), but it shows up in cases like these: the tangent of a real number that has been multiplied by a complex number must be real, and structured matrices (like diagonal matrices) need tangents that follow the same structure.
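Two concrete instances of this behaviour, using ChainRulesCore's ProjectTo directly:

```julia
using LinearAlgebra
using ChainRulesCore

# A real primal forces a real tangent: the imaginary part is dropped.
ProjectTo(1.0)(2.0 + 3.0im)

# A Diagonal primal forces a Diagonal tangent: off-diagonal entries of a
# dense tangent are discarded.
P = ProjectTo(Diagonal([1.0, 2.0]))
P([1.0 5.0; 6.0 2.0])
```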
Anyway, that projection also needs to be done lazily if we want to return a lazy array from the 3-arg dot rrule.
Which can be done, but it requires overloading ProjectTo on LazyArray.
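A hypothetical shape for such an overload, purely as an illustration: nothing like this exists in LazyArrays or ChainRulesCore today, the axes/shape check is skipped for brevity, and the method is written for Base's `Broadcasted` since that is what the pullback in this PR returns:

```julia
using ChainRulesCore

# Hypothetical lazy projection: instead of materialising the Broadcasted
# tangent, fuse the elementwise projection into it and stay lazy.
function (p::ChainRulesCore.ProjectTo{AbstractArray})(bc::Base.Broadcast.Broadcasted)
    # p.element is the stored elementwise projector (e.g. for Float64);
    # broadcasting it over bc keeps the result lazy. Axes checking omitted.
    return Broadcast.broadcasted(p.element, bc)
end
```

Materialising the result later (e.g. with `Broadcast.materialize`) then applies the element projection as part of the fused broadcast, so nothing dense is ever allocated on the way.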