
Broadcast LinearOperators

tmigot opened this issue 4 years ago · 5 comments

What is the best way to broadcast LinearOperators? For instance,

Jx = jac_op(nlp, x) # where nlp is an AbstractNLPModel, and x a vector

and then

Jx .= jac_op(nlp, y)

tmigot avatar Jul 17 '21 04:07 tmigot

You can't do that (at least not in the current state of things). Maybe we could cook up jac_op!(J, nlp, x).

dpo avatar Jul 19 '21 14:07 dpo
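A minimal sketch of what an in-place `jac_op!(J, nlp, x)` could look like, assuming the operator stores its evaluation point internally. The `JacOp` struct and this `jac_op!` signature are hypothetical illustrations, not the actual NLPModels.jl API:

```julia
# Hypothetical operator that stores its evaluation point.
mutable struct JacOp{T}
    x::Vector{T}           # evaluation point kept inside the operator
end

# "Update" the operator in place by overwriting the stored point,
# instead of allocating a brand-new operator.
function jac_op!(J::JacOp, x::AbstractVector)
    copyto!(J.x, x)
    return J
end

J = JacOp([1.0, 2.0])
jac_op!(J, [3.0, 4.0])     # reuses J; no new operator is built
J.x                        # now [3.0, 4.0]
```

The point of the design is that the caller keeps a single operator object and only refreshes the data it depends on.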

I think that if you define Jx = jac_op(nlp, x), and change x afterward with .=, then Jx is changed automatically. At least for the default jac_op.

abelsiqueira avatar Jul 19 '21 14:07 abelsiqueira
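The aliasing behavior @abelsiqueira describes can be sketched with a plain closure; `make_op` below stands in for `jac_op` (it is not the real function). The operator holds a reference to the array it was given, so in-place mutation with `.=` is visible through the operator, while rebinding the caller's variable with `=` is not:

```julia
# make_op returns an operator-like closure that captures the array x.
make_op(x) = v -> x .* v

x = [1.0, 2.0]
op = make_op(x)

op([1.0, 1.0])             # uses the current contents of x: [1.0, 2.0]

x .= [10.0, 20.0]          # in-place mutation: op sees the new values
op([1.0, 1.0])             # [10.0, 20.0]

x = [0.0, 0.0]             # rebinding: op still holds the old array
op([1.0, 1.0])             # still [10.0, 20.0]
```

This is exactly the side effect discussed below: whether it is a feature or a trap depends on whether the user expects the operator to alias `x`.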

Is it? That wouldn't really be a desirable side effect.

dpo avatar Jul 19 '21 15:07 dpo

Perhaps we could store the x value inside the operator? Maybe creating a HessianLinearOperator?

abelsiqueira avatar Jul 22 '21 00:07 abelsiqueira

Bumping this conversation following the discussion in https://github.com/JuliaSmoothOptimizers/NLPModels.jl/pull/416

What @abelsiqueira described

I think that if you define Jx = jac_op(nlp, x), and change x afterward with .=, then Jx is changed automatically. At least for the default jac_op.

is still true with LinearOperators v2.

Besides, I am not sure how we would replace op.prod!: we cannot do

op.prod! = new_prod!

because of this error:

MethodError: Cannot `convert` an object of type NLPModels.var"#40#41"{Float64, SimpleNLPModel{Float64, Vector{Float64}}, Vector{Float64}, Vector{Float64}, typeof(*), typeof(+)} to an object of type NLPModels.var"#49#50"{Float64, SimpleNLPModel{Float64, Vector{Float64}}, Vector{Float64}, Vector{Float64}, typeof(*), typeof(+)}

tmigot avatar Sep 01 '22 16:09 tmigot
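The `MethodError` above comes from a language-level fact: every closure in Julia has its own concrete type (the `var"#40#41"{...}` and `var"#49#50"{...}` in the message), so a struct field parameterized on one closure's type cannot be assigned a different closure. A minimal reproduction with a hypothetical `Op` struct (not the actual LinearOperators.jl definition), plus one possible workaround:

```julia
f = (res, v) -> (res .= 2 .* v)
g = (res, v) -> (res .= 3 .* v)

typeof(f) == typeof(g)     # false: each anonymous function has its own type

mutable struct Op{F}
    prod!::F               # field type is fixed to typeof(f) at construction
end

op = Op(f)
# op.prod! = g             # MethodError: cannot convert typeof(g) to typeof(f)

# Workaround sketch: widen the field to the abstract type Function,
# trading some dispatch performance for the ability to swap the product.
mutable struct LooseOp
    prod!::Function
end

lop = LooseOp(f)
lop.prod! = g              # accepted, since g <: Function
```

An alternative, allocation-free direction is the one discussed above: keep `prod!` fixed and mutate the data it closes over.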