geoffroyleconte

Results 28 comments of geoffroyleconte

Should I modify
```julia
function jprod!(nlp::AbstractNLPModel, rows::AbstractVector{
```

Hi! With v2.0 we updated the way to create operators, so that it is generic with matrix-vector products using `mul!` from LinearAlgebra. You can find an example here: https://juliasmoothoptimizers.github.io/LinearOperators.jl/stable/tutorial/#Using-functions ....
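For reference, the 5-argument `mul!` convention that these generic products follow computes `res = α*A*v + β*res` in place; a minimal stdlib-only sketch:

```julia
using LinearAlgebra

A = [1.0 2.0; 3.0 4.0]
v = [1.0, 1.0]
res = [1.0, 1.0]

# 5-arg mul! computes res = α*A*v + β*res in place.
# A*v = [3.0, 7.0], so res becomes 2*[3, 7] + 0.5*[1, 1] = [6.5, 14.5]
mul!(res, A, v, 2.0, 0.5)
```

An operator's `prod!(res, v, α, β)` function is expected to behave exactly like this call.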

If you want a quick fix you could try something like
```julia
function prod_op!(res, x, α, β)
  if β == 0
    res .= prod(x) .* α
  else
    res .= prod(x)...
```
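A self-contained completion of that pattern; the `else` branch is my guess at the intended update `res = α*prod(x) + β*res`, and `myprod` is a hypothetical stand-in for the operator's underlying α/β-unaware product:

```julia
using LinearAlgebra

A = [2.0 0.0; 0.0 3.0]
myprod(x) = A * x  # stand-in for the α/β-unaware product being wrapped

# Wrap myprod into the 5-arg form res = α*myprod(x) + β*res.
# The β == 0 branch avoids reading res, which may hold garbage values.
function prod_op!(res, x, α, β)
  if β == 0
    res .= myprod(x) .* α
  else
    res .= myprod(x) .* α .+ res .* β
  end
  res
end

res = [1.0, 1.0]
prod_op!(res, [1.0, 1.0], 2.0, 0.5)  # 2*[2, 3] + 0.5*[1, 1] = [4.5, 6.5]
</imports>
```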

Yes, but that was in case we would want to keep the current solution to use as a warm start later, for example.

@amontoison do you know the origin of the failure with Krylov? It looks like it's some failure with Aqua, but I don't know why this is related to this PR.

I needed to remove the constructor `LinearOperator{T}(.....)` and replace it with `LinearOperator(::Type{T}, ....) where {T}` because it would not work for `LinearOperator5{T}(......)`, but this is a breaking change.
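A toy sketch of why the `::Type{T}` argument style generalizes while a parametric constructor is tied to one concrete type; `AbstractOp`, `OpA`, `OpB`, and `makeop` are hypothetical names for illustration, not the LinearOperators API:

```julia
# Hypothetical types standing in for LinearOperator and its variants.
abstract type AbstractOp{T} end
struct OpA{T} <: AbstractOp{T}
  n::Int
end
struct OpB{T} <: AbstractOp{T}
  n::Int
end

# Passing the element type as an argument lets one generic helper build
# any concrete operator type, which a constructor written as OpA{T}(...)
# cannot do: that constructor only ever produces OpA.
makeop(::Type{Op}, ::Type{T}, n::Int) where {Op <: AbstractOp, T} = Op{T}(n)

makeop(OpA, Float64, 3)  # OpA{Float64}(3)
makeop(OpB, Float32, 2)  # OpB{Float32}(2)
```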

The error is probably due to this line: https://github.com/JuliaSmoothOptimizers/LinearOperators.jl/blob/22cf605514369b52e70100501d0f92dd50fcccc6/src/operations.jl#L119 What are the storage types of `H` and `cg_op`?

Yes, where is `hess_op!` defined? I can't find it in ADNLPModels.

Get the diagonal elements of a CUDA sparse matrix (see also https://github.com/JuliaSmoothOptimizers/RipQP.jl/blob/51e928f42beb4af86b506339799cc653e8198ec4/src/gpu_utils.jl#L50), but this could probably be improved (I only used it with the Jacobi preconditioner).
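A CPU sketch of the same idea using the SparseArrays stdlib, scanning each column's stored entries for the diagonal element (the linked GPU version does this per column in a kernel); `sparse_diag` is a hypothetical name:

```julia
using SparseArrays, LinearAlgebra

# Extract the diagonal of a CSC sparse matrix by scanning each column's
# stored entries for a row index equal to the column index.
function sparse_diag(A::SparseMatrixCSC{T}) where {T}
  n = min(size(A)...)
  d = zeros(T, n)
  rv = rowvals(A)
  nz = nonzeros(A)
  for j in 1:n
    for k in nzrange(A, j)
      if rv[k] == j
        d[j] = nz[k]
        break
      end
    end
  end
  d
end

A = sparse([1.0 0.0; 2.0 3.0])
sparse_diag(A)  # [1.0, 3.0]
```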

I think something like this might work (K2 with GMRES, no presolve, no scaling, and easy stopping tolerances), but I haven't run tests that worked since CUDA 3 if I...
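For what it's worth, a hedged sketch of such a configuration; the keyword names (`sp`, `ps`, `scaling`, `itol`) and the `K2KrylovParams`/`InputTol` constructors are assumptions based on RipQP's options and should be checked against the installed version:

```julia
# Hypothetical configuration sketch; verify kwarg names against RipQP's docs.
using RipQP, QuadraticModels

# qm = QuadraticModel(...)  # the problem to solve, built elsewhere
stats = ripqp(qm;
  sp = K2KrylovParams(kmethod = :gmres),  # K2 system solved with GMRES
  ps = false,                             # no presolve
  scaling = false,                        # no scaling
  itol = InputTol(ϵ_pdd = 1.0e-4, ϵ_rb = 1.0e-4, ϵ_rc = 1.0e-4),  # easy tolerances
)
```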