
Introduce a `get_differential`

Open · kellertuer opened this issue 5 months ago · 3 comments

There are a few places where a `get_differential(M, obj, p, X)` that computes $Df(p)[X]$ directly might be beneficial compared to evaluating the inner product with the gradient, especially when a certain objective `obj` provides easier access to it.

By default, on a GradientObjective, it could always fall back to computing said inner product with `get_gradient`; a minimal sketch of such a fallback follows the task list below.

  • [ ] introduce `get_differential` for suitable objectives
  • [ ] introduce a default implementation for the cases where it fits
  • [ ] replace occurrences where we currently compute inner products with the gradient by this new function, for example in the Armijo and Wolfe condition checks, but also in `check_gradient`.
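A minimal sketch of the default fallback, assuming the existing `AbstractManifoldGradientObjective` and `get_gradient` from Manopt.jl and `inner` from ManifoldsBase.jl; `get_differential` itself is the new, not-yet-existing function:

```julia
using Manopt, ManifoldsBase

# Sketch only: `get_differential` does not exist yet. The fallback computes
# Df(p)[X] = ⟨grad f(p), X⟩_p via the Riemannian metric.
function get_differential(
    M::AbstractManifold, obj::AbstractManifoldGradientObjective, p, X
)
    return inner(M, p, get_gradient(M, obj, p), X)
end
```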

cc @jonas-pueschel

kellertuer · Jun 16 '25 08:06

Good idea. For reference, LineSearches.jl already supports this through the `dϕ` function. Also, quite often we need both the value of the objective and the differential, and it is often cheaper to compute both at once; for example, AD computes both anyway even when we just need the differential. LineSearches.jl uses `ϕdϕ` for that.
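For illustration, a small Euclidean toy example of that calling convention (the `ϕ`/`dϕ`/`ϕdϕ` closures and the callable line-search object follow the LineSearches.jl interface; the objective and direction are made up):

```julia
using LineSearches, LinearAlgebra

f(x) = sum(abs2, x)        # toy cost
grad_f(x) = 2 .* x         # its (Euclidean) gradient

x = [1.0, 2.0]             # current iterate
d = -grad_f(x)             # descent direction

ϕ(α) = f(x .+ α .* d)                  # cost along the line
dϕ(α) = dot(grad_f(x .+ α .* d), d)    # its derivative
ϕdϕ(α) = (ϕ(α), dϕ(α))                 # both at once; real code would share work

ls = HagerZhang()
α, ϕα = ls(ϕ, dϕ, ϕdϕ, 1.0, ϕ(0.0), dϕ(0.0))
```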

mateuszbaran · Jun 16 '25 10:06

I would probably do that similarly to the CostGradObjective, which can compute both cost and gradient at the same time.
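In that spirit, a hypothetical combined accessor (the name `get_cost_and_differential` is made up here) could default to the two separate calls, while an objective that can compute both in one pass overloads it:

```julia
# Hypothetical name, for illustration only: return the cost f(p) and the
# differential Df(p)[X] together, so that objectives which can compute
# both in one pass may overload this method.
function get_cost_and_differential(M, obj, p, X)
    return (get_cost(M, obj, p), get_differential(M, obj, p, X))
end
```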

kellertuer · Jun 16 '25 10:06

I had a better idea: a `GradientDifferentialFunctor` (similar to the existing vector ones) that would just be put into gradient places but can do a few things better. A rough sketch:
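```julia
# Rough sketch; all names here are hypothetical and nothing exists yet.
# The functor bundles a gradient function with a differential function.
struct GradientDifferentialFunctor{G,D}
    grad::G           # (M, p) -> grad f(p)
    differential::D   # (M, p, X) -> Df(p)[X]
end

# Callable like a plain gradient function, so it can be passed wherever
# a gradient is expected ...
(f::GradientDifferentialFunctor)(M, p) = f.grad(M, p)

# ... while code aware of it can evaluate the differential directly.
(f::GradientDifferentialFunctor)(M, p, X) = f.differential(M, p, X)
```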

When I find time to tinker with this, I will know whether the previous idea or this one is better ;)

kellertuer · Jun 17 '25 13:06