Optimization.jl

Wrap HyperHessians.jl as an AD backend

Open · Vaibhavdixit02 opened this issue 2 years ago · 4 comments

Though HyperHessians.jl only computes Hessians, combining a gradient call from another package with a Hessian from https://github.com/KristofferC/HyperHessians.jl would be interesting for second-order methods.
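
A minimal sketch of the combination, assuming HyperHessians.jl exposes a `hessian(f, x)` entry point (check its README for the exact API):

```julia
using ForwardDiff, HyperHessians

rosenbrock(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
x0 = [0.5, 0.5]

g = ForwardDiff.gradient(rosenbrock, x0)   # gradient from ForwardDiff
H = HyperHessians.hessian(rosenbrock, x0)  # Hessian from HyperHessians (assumed entry point)
```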

Vaibhavdixit02 · Dec 24 '21 08:12

Were there any more thoughts on how this might best be implemented? I was thinking about how to fit it into the interface, since I need to use some of these methods. One option would be an extra field in e.g. AutoForwardDiff that selects a different Hessian method (e.g. `hyper::Bool`), though that might be a bit awkward in the code. Perhaps once your comment in https://github.com/SciML/Optimization.jl/issues/314#issuecomment-1177048027 (and Chris' following comment) is resolved down the line, this would be very convenient to add.

DanielVandH · Jul 08 '22 01:07

Since it has limitations, it would be nice to have it as an option in AutoForwardDiff, like `fasthes = true`. We'd also have to set up the manual seeding for sparsity.
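
Roughly, as a hypothetical sketch of the interface only (the actual AutoForwardDiff definition in Optimization.jl differs):

```julia
abstract type AbstractADType end  # stand-in for Optimization.jl's own abstract type

# Hypothetical sketch: a flag on AutoForwardDiff that switches the Hessian backend.
struct AutoForwardDiff <: AbstractADType
    fasthes::Bool  # true => compute Hessians with HyperHessians.jl instead of nested ForwardDiff
end
AutoForwardDiff(; fasthes = false) = AutoForwardDiff(fasthes)

# The function instantiation could then branch on the flag, e.g.
# hess = adtype.fasthes ? HyperHessians.hessian : nested_forwarddiff_hessian
```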

ChrisRackauckas · Jul 08 '22 22:07

> manual seeding for sparsity

What do you mean by this?

DanielVandH · Jul 09 '22 00:07

The way the sparse coloring is done; see https://book.sciml.ai/notes/09/.
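
For reference, the coloring step looks something like the following sketch with SparseDiffTools.jl (the SciML package implementing the approach from those notes); the "manual" part would be building the dual-number seeds per color group for HyperHessians:

```julia
using LinearAlgebra, SparseArrays, SparseDiffTools

# Known Hessian sparsity pattern (tridiagonal, as an example):
hess_sparsity = sparse(Tridiagonal(ones(4), ones(5), ones(4)))

# Columns sharing a color are structurally orthogonal and can be perturbed
# with a single seed, so the number of AD passes drops from
# size(hess_sparsity, 2) to maximum(colors).
colors = matrix_colors(hess_sparsity)
```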

ChrisRackauckas · Jul 09 '22 04:07