[WIP] Create specific operator for hprod
A first jab at avoiding the implicit change of `hess_op`; up for discussion.
To use it in code like Trunk, instead of writing `H = hess_op!(nlp, x, Hv)`, we would just write `H.x .= x`. We could wrap that in an `update!(H, x)`, but it does not seem necessary.
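For concreteness, here is a minimal sketch of the idea (the struct name matches the PR title, but the fields, constructor, and `mul!` method below are my assumptions, not the actual implementation):

```julia
using LinearAlgebra, NLPModels

# Hypothetical sketch: an operator that stores the model and the
# evaluation point, instead of capturing x in a closure.
struct HprodOperator{M <: AbstractNLPModel, S <: AbstractVector}
  nlp::M
  x::S  # current evaluation point; refresh with H.x .= x
end

HprodOperator(nlp::AbstractNLPModel) = HprodOperator(nlp, copy(nlp.meta.x0))

# Applying the operator computes a Hessian-vector product at the stored x.
LinearAlgebra.mul!(Hv::AbstractVector, H::HprodOperator, v::AbstractVector) =
  hprod!(H.nlp, H.x, v, Hv)
```

Trunk would then build `H = HprodOperator(nlp)` once and do `H.x .= x` at each iteration, with `mul!(Hv, H, v)` picking up the new point automatically.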
Issues:
- Verbose
- There is currently no way to pass an existing `x` to create an `HprodOperator`, but that could be changed if we want to. The API is becoming too convoluted, though.
- The `hess_op!` with `rows`, `cols`, `vals` will still suffer from the same implicit issue, unless we also create an `HprodOperator` for it.
I don't like the `H.x .= x`, since for the following type of operator it wouldn't do anything (the returned operator wraps `rows`, `cols`, `vals`, and `vals` is recomputed from `x` only inside `hess_op!`):
```julia
function hess_op!(
  nlp::GridapPDENLPModel,
  x::AbstractVector,
  Hv::AbstractVector;
  obj_weight::Real = one(eltype(x)),
)
  @lencheck nlp.meta.nvar x Hv
  rows = nlp.pdemeta.Hrows
  cols = nlp.pdemeta.Hcols
  # vals depends on x, but it is refreshed only here, at construction time;
  # setting H.x .= x afterwards would not recompute it.
  vals = hess_coord!(nlp, x, nlp.workspace.Hvals, obj_weight = obj_weight)
  return hess_op!(nlp, rows, cols, vals, Hv)
end
```
This is looking good, thanks!
> The `hess_op!` with `rows`, `cols`, `vals` will still suffer from the same implicit issue, unless we also create a `HprodOperator` for it.
Could it be another constructor?
OK, so if I understand correctly: whenever we reimplement `hess_op!`, we would now have to specialize `HprodOperator!(nlp, x, Hv)` and `update!(::ModelOperator, x)`. Is that right?
If we want to reuse the structure for another `nlp` of the same size, we would also need an `update!(::ModelOperator, nlp)` and a way to broadcast `nlp`, assuming they respect some conditions.
Some comments on the comments
> I don't like the `H.x .= x` since for the following type of operator that wouldn't do anything: ...

> OK, so if I understand correctly: whenever we reimplement `hess_op!`, we would now have to specialize `HprodOperator!(nlp, x, Hv)` and `update!(::ModelOperator, x)`. Is that right?
Yes, I think that will be necessary, but only for models that feel the need to specialize `hess_op!` itself. They would instead specialize `HprodOperator`, so the additional work is about the same (well, almost the same).
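To make that concrete with the Gridap example above, a hypothetical specialization (building on the sketch at the top of the thread; the `update!` name, the dispatch, and the field names are all assumptions) could look like:

```julia
using LinearAlgebra, NLPModels, PDENLPModels

# Hypothetical: instead of reimplementing hess_op!, such a model would
# specialize the update so that vals is refreshed at the new point.
function update!(
  H::HprodOperator{M},
  x::AbstractVector;
  obj_weight::Real = one(eltype(x)),
) where {M <: GridapPDENLPModel}
  H.x .= x
  hess_coord!(H.nlp, x, H.nlp.workspace.Hvals, obj_weight = obj_weight)  # recompute vals at x
  return H
end
```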
> If we want to reuse the structure for another `nlp` of the same size, we would also need an `update!(::ModelOperator, nlp)` and a way to broadcast `nlp`, assuming they respect some conditions.
Maybe we need an `update!(::Solver, nlp)`. Consider the following:
```julia
solver = TrunkSolver(nlp)  # solver.H exists
nlp = Problem1()
output = solve!(solver, nlp)
# do things
nlp = Problem2()
### Option 1: update solver.H ???
### Option 2: update solver
output = solve!(solver, nlp)
```
Option 1 requires specific knowledge of the solver. Option 2 can be a no-op for many solvers. We could also accept keyword arguments to update other solver aspects, like the subsolver.
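As an illustration of Option 2 (hypothetical sketch; the solver type here is a toy stand-in, not JSOSolvers' actual `TrunkSolver`):

```julia
using NLPModels

# Toy solver type for illustration only.
mutable struct TrunkSolverSketch{O}
  H::O  # cached Hessian operator
end

# Fallback: updating is a no-op for solvers that cache nothing.
update!(solver, nlp::AbstractNLPModel; kwargs...) = solver

# Solvers that do cache an operator rebind it to the new model.
function update!(solver::TrunkSolverSketch, nlp::AbstractNLPModel; kwargs...)
  solver.H = HprodOperator(nlp)  # from the sketch near the top of the thread
  return solver
end
```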
@dpo
> The `hess_op!` with `rows`, `cols`, `vals` will still suffer from the same implicit issue, unless we also create a `HprodOperator` for it. Could it be another constructor?
Not like this, because `ModelOperator` stores `x`. We would need to store `rows`, `cols`, `vals`, I think.
Do we have any use cases for this already? I think our solvers that use `rows`, `cols`, `vals` are all factorization-based.
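For reference, a COO-backed variant would presumably have to carry the triplet itself (hypothetical sketch; it assumes only the lower triangle is stored, as `hess_coord` does, and that the model refreshes `vals` when the point changes):

```julia
using LinearAlgebra

# Hypothetical: stores rows, cols, vals instead of x.
struct CooHprodOperator{T, S <: AbstractVector{T}}
  rows::Vector{Int}
  cols::Vector{Int}
  vals::S  # refreshed externally, e.g. via hess_coord!
end

function LinearAlgebra.mul!(Hv::AbstractVector, H::CooHprodOperator, v::AbstractVector)
  fill!(Hv, zero(eltype(Hv)))
  for k in eachindex(H.rows)
    i, j, a = H.rows[k], H.cols[k], H.vals[k]
    Hv[i] += a * v[j]
    i != j && (Hv[j] += a * v[i])  # symmetric Hessian, lower triangle stored
  end
  return Hv
end
```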
Codecov Report
Base: 99.50% // Head: 98.66% // Decreases project coverage by -0.83% :warning:
Coverage data is based on head (`7ceda76`) compared to base (`d6c1a43`). Patch coverage: 75.00% of modified lines in pull request are covered.
Additional details and impacted files
```diff
@@           Coverage Diff           @@
##             main     #423   +/-   ##
=======================================
- Coverage   99.50%   98.66%   -0.84%
=======================================
  Files          13       14       +1
  Lines         801      823      +22
=======================================
+ Hits          797      812      +15
- Misses          4       11       +7
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/NLPModels.jl | 100.00% <ø> (ø) | |
| src/nlp/operator.jl | 74.07% <74.07%> (ø) | |
| src/nlp/api.jl | 99.75% <100.00%> (-0.01%) | :arrow_down: |
| src/nlp/meta.jl | 100.00% <0.00%> (ø) | |