Providing gradients and automatic differentiation

Affie opened this issue 4 years ago • 5 comments (status: Open)

Many of the factors' gradients can be computed analytically. For the others, automatic differentiation can be done with packages such as Zygote or ForwardDiff.

I'm starting this broad issue to look at what can be done and whether it is worth it.
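For concreteness, here is a minimal sketch of both routes on a made-up range-style residual (not an actual IncrementalInference factor), comparing forward-mode AD, reverse-mode AD, and the hand-derived gradient:

```julia
using ForwardDiff, Zygote

# Hypothetical 1D range-style residual; z plays the role of a measurement.
z = 2.0
resid(x) = (hypot(x[1], x[2]) - z)^2

x0 = [1.0, 1.0]

ForwardDiff.gradient(resid, x0)   # forward-mode AD
Zygote.gradient(resid, x0)[1]     # reverse-mode AD

# Analytic gradient for comparison: 2(|x| - z) * x / |x|
r = hypot(x0[1], x0[2])
2 * (r - z) .* x0 ./ r
```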

Analytic gradients

  • We can have an optionally defined gradient function for every factor, with a fallback to finite-difference or automatic differentiation (see the sketch after this list)
  • I tried it, but the function lambdas are not currently built in a way that supports it. Perhaps we can keep it in mind with the upcoming refactor CCW --- XXX --- CF
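One possible shape for the first bullet is plain multiple dispatch: a generic AD fallback that any factor can override with an analytic method. The names below (`AbstractMyFactor`, `residual`, `factorgradient`, the two factor structs) are illustrative assumptions, not the actual CCW/CF API:

```julia
using ForwardDiff

abstract type AbstractMyFactor end   # stand-in, not the real DFG/IIF type

# Fallback: any factor without an analytic gradient gets AD of its residual.
# `residual(f, x)` is an assumed interface, not the actual CCW/CF API.
factorgradient(f::AbstractMyFactor, x) =
    ForwardDiff.gradient(x_ -> residual(f, x_), x)

# A factor relying on the AD fallback:
struct RangeFactor <: AbstractMyFactor
    z::Float64
end
residual(f::RangeFactor, x) = (hypot(x[1], x[2]) - f.z)^2

# A factor that overrides the fallback with its known analytic gradient:
struct EuclidPrior <: AbstractMyFactor
    z::Vector{Float64}
end
residual(f::EuclidPrior, x) = sum(abs2, x - f.z)
factorgradient(f::EuclidPrior, x) = 2 .* (x - f.z)   # analytic, no AD needed

factorgradient(RangeFactor(2.0), [1.0, 1.0])          # uses ForwardDiff
factorgradient(EuclidPrior([0.0, 0.0]), [1.0, 1.0])   # uses analytic method
```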

Automatic differentiation

  • I tried ForwardDiff, and it hits the same Dual-number issue the parametric solver had: values are fixed to Float64 (illustrated below)
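A minimal standalone reproduction of that Dual-number issue, assuming the failure mode is a preallocated Float64 buffer inside the residual (the variable names here are made up):

```julia
using ForwardDiff

# Fails: the preallocated buffer pins the element type to Float64, so
# ForwardDiff's Dual numbers cannot be stored in it.
buf = zeros(Float64, 2)
f_bad(x) = (buf .= 2 .* x; sum(abs2, buf))
# ForwardDiff.gradient(f_bad, [1.0, 2.0])
# -> MethodError: no method matching Float64(::ForwardDiff.Dual{...})

# Works: allocate storage matching the (possibly Dual) input element type.
function f_ok(x)
    tmp = similar(x)          # eltype follows x, so Duals flow through
    tmp .= 2 .* x
    return sum(abs2, tmp)
end
ForwardDiff.gradient(f_ok, [1.0, 2.0])    # [8.0, 16.0]
```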

Affie avatar Feb 24 '21 07:02 Affie

Hi Johan,

Ooo,

I tried it, but the function lambdas are not currently built in a way that supports it

Could you say a bit more about that, please? Part of the #467 transition forces us to use lambdas, and I was expecting lambda definitions to be fine for autodiff tools. You don't mean "lambdas simply don't work with autodiff", right? Or are there particular cases, similar to forcing ::Float64 rather than ::Real, that make lambdas tricky but still workable?

dehann avatar Feb 24 '21 19:02 dehann

I had two distinct problems:

  • The way the lambdas are currently built only affects user-provided (or Zygote-provided) gradients. It will just have to be changed to support them while still maintaining support for automatic differentiation (see the sketch after this list).
  • Relaxing the Float64 restriction is needed for forward-mode automatic differentiation.
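One way the first point could look: build the solve lambdas so a user-supplied gradient is used when present, with AD as the fallback. `make_lambdas` and `usergrad` are hypothetical names for illustration, not the existing CCW construction:

```julia
using ForwardDiff

# Hypothetical lambda construction: an optional user/analytic gradient is
# preferred; otherwise fall back to ForwardDiff on the residual closure.
function make_lambdas(residual; usergrad = nothing)
    f = x -> residual(x)
    g = usergrad === nothing ? (x -> ForwardDiff.gradient(f, x)) : usergrad
    return f, g
end

f1, g1 = make_lambdas(x -> sum(abs2, x))                        # AD fallback
f2, g2 = make_lambdas(x -> sum(abs2, x); usergrad = x -> 2x)    # analytic

g1([1.0, 2.0])   # [2.0, 4.0] via ForwardDiff
g2([1.0, 2.0])   # [2.0, 4.0] via the supplied gradient
```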

Affie avatar Feb 25 '21 06:02 Affie

xref https://github.com/JuliaRobotics/IncrementalInference.jl/issues/1546

Affie avatar Jul 05 '22 14:07 Affie

Also see package: https://github.com/SciML/PreallocationTools.jl
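PreallocationTools.jl is aimed at exactly the Float64-vs-Dual problem above: its `DiffCache` holds both a plain buffer and a Dual-compatible one, and `get_tmp` hands back whichever matches the input's element type. A short sketch (the residual here is made up):

```julia
using PreallocationTools, ForwardDiff

# DiffCache keeps a Float64 buffer plus a Dual-compatible buffer, so a
# preallocated residual workspace can be reused under ForwardDiff.
cache = DiffCache(zeros(2))

function f_cached(x)
    tmp = get_tmp(cache, x)   # returns the buffer matching eltype(x)
    tmp .= 2 .* x
    return sum(abs2, tmp)
end

f_cached([1.0, 2.0])                        # plain Float64 call
ForwardDiff.gradient(f_cached, [1.0, 2.0])  # Duals flow through the cache
```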

Affie avatar Jul 05 '22 14:07 Affie

This might also influence:

  • #1513

dehann avatar Jul 05 '22 19:07 dehann