Providing gradients and automatic differentiation
Many of the factors' gradients can be computed analytically. For the others, automatic differentiation can be done with packages such as Zygote or ForwardDiff.
I'm starting this broad issue to look at what can be done and whether it is worth it.
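For concreteness, here is a minimal sketch of the two options on a toy residual. This is illustration only and assumes nothing about the IIF factor API; `residual` and `analytic_J_x2` are hypothetical names.

```julia
using ForwardDiff
using LinearAlgebra

# Toy "between"-style residual: predicted offset minus measurement
# (hypothetical, not the IIF factor API).
residual(z, x1, x2) = (x2 .- x1) .- z

# Analytic Jacobian of this residual w.r.t. x2 is just the identity.
analytic_J_x2(z, x1, x2) = Matrix{Float64}(I, length(x2), length(x2))

z, x1, x2 = [1.0, 0.0], [0.0, 0.0], [1.1, 0.1]
J_ad = ForwardDiff.jacobian(x -> residual(z, x1, x), x2)   # automatic differentiation
J_an = analytic_J_x2(z, x1, x2)                            # analytic gradient
J_ad ≈ J_an   # true
```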
Analytic gradients
- We can have an optionally defined gradient function for every factor, with a fallback to finite-difference AD (see the sketch after this list)
- I tried it, but function lambdas are not currently created to support it. Perhaps we can keep it in mind with the upcoming refactor CCW --- XXX --- CF
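A rough sketch of that fallback pattern, with hypothetical names (`factorJacobian`, `MyPrior`) rather than the current CCW/CalcFactor machinery: a factor type can specialize the gradient function, and everything else falls back to finite differences.

```julia
using LinearAlgebra

# Hypothetical factor with a trivial residual, for illustration only.
struct MyPrior
    z::Vector{Float64}
end
residual(f::MyPrior, x) = x .- f.z

# Generic fallback: central finite differences on the residual.
function factorJacobian(f, x; h = 1e-6)
    r0 = residual(f, x)
    J = zeros(length(r0), length(x))
    for j in eachindex(x)
        e = zeros(length(x)); e[j] = h
        J[:, j] = (residual(f, x .+ e) .- residual(f, x .- e)) ./ (2h)
    end
    return J
end

# A factor opts in by providing its analytic Jacobian.
factorJacobian(f::MyPrior, x; kwargs...) = Matrix{Float64}(I, length(x), length(x))

factorJacobian(MyPrior([0.0, 0.0]), [1.0, 2.0])   # dispatches to the analytic method
```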
Automatic differentiation
- I tried forward differentiation and it hits the same Dual-number issue the parametric solver had, since values are fixed to `Float64` (see the sketch below).
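To illustrate the clash (a generic sketch, not IIF code): any buffer hard-typed to `Float64` cannot hold the `ForwardDiff.Dual` values that forward-mode AD pushes through the residual, while an `eltype(x)`-typed buffer works.

```julia
using ForwardDiff

# Fails under ForwardDiff: the buffer is fixed to Float64 and cannot store Duals.
f_fixed(x) = (buf = zeros(Float64, length(x)); buf .= 2 .* x; sum(abs2, buf))

# Works: the buffer element type follows the input (Float64 normally, Dual under AD).
f_generic(x) = (buf = zeros(eltype(x), length(x)); buf .= 2 .* x; sum(abs2, buf))

x = [1.0, 2.0]
ForwardDiff.gradient(f_generic, x)    # [8.0, 16.0]
# ForwardDiff.gradient(f_fixed, x)    # errors: a Dual cannot be converted to Float64
```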
Hi Johan,
Ooo,
> I tried it, but function lambdas are not currently created to support it
Could you say a bit more about that, please? Part of the #467 transition forces us to use lambdas, and I was expecting that lambda definitions would be fine for autodiff tools. You don't mean "lambdas simply don't work with autodiff", right? Or are there particular cases, similar to forcing `::Float64` rather than `::Real`, that make lambdas tricky but still workable?
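For context, a closure/lambda by itself does differentiate fine with both tools; the trouble only starts when `Float64`-typed storage is captured or forced. A minimal check, unrelated to the IIF lambda machinery:

```julia
using ForwardDiff, Zygote

meas = [1.0, 2.0]
resid = x -> x .- meas                     # lambda closing over the measurement

ForwardDiff.jacobian(resid, [1.5, 2.5])    # forward mode: works
Zygote.jacobian(resid, [1.5, 2.5])         # reverse mode: works
```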
I had 2 very distinct problems:
- The way the lambdas are currently built only affects user-provided (or Zygote-provided) gradients. It will just have to be changed to support those while still maintaining support for automatic differentiation.
- The `Float64` change (allowing `::Real` rather than forcing `::Float64`) is needed for forward-mode automatic differentiation.
xref https://github.com/JuliaRobotics/IncrementalInference.jl/issues/1546
Also see package: https://github.com/SciML/PreallocationTools.jl
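PreallocationTools.jl targets exactly the `Float64` buffer problem: a `DiffCache` holds a preallocated `Float64` buffer plus a Dual-compatible shadow, and `get_tmp` hands back whichever matches the incoming element type. A rough sketch, assuming the `DiffCache`/`get_tmp` names from the package README (the constructor name has changed across versions, e.g. the older `dualcache`):

```julia
using PreallocationTools, ForwardDiff

cache = DiffCache(zeros(2))   # preallocated buffer + Dual-compatible shadow

function resid(x, cache)
    buf = get_tmp(cache, x)   # Float64 buffer normally, Dual buffer under ForwardDiff
    buf .= 2 .* x
    return sum(abs2, buf)
end

ForwardDiff.gradient(x -> resid(x, cache), [1.0, 2.0])   # [8.0, 16.0], no reallocation
```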
This might also influence:
- #1513