`autodiff`: add support for Jacobian and Hessian matrices.
Currently, only gradients can be computed via the `RustQuant::autodiff` module. Adding support for full Jacobian matrices, as well as higher-order derivatives such as the Hessian, would be nice.
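For context, this is roughly what the module offers today. A minimal sketch based on the gradient API shown in the crate's README (`Graph::new`, `var`, `accumulate`, `wrt`); exact details may differ between versions:

```rust
use RustQuant::autodiff::*;

fn main() {
    // Computation graph (the tape) that records operations.
    let g = Graph::new();

    // Input variables registered on the graph.
    let x = g.var(1.0);
    let y = g.var(2.0);

    // A scalar-valued function of the inputs.
    let f = x * x + x * y;

    // One reverse pass accumulates all partial derivatives.
    let grad = f.accumulate();

    println!("df/dx = {}", grad.wrt(&x)); // 2x + y = 4
    println!("df/dy = {}", grad.wrt(&y)); // x = 1
}
```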
@avhz Hi, I am not familiar with the codebase but I will try to figure something out in the next couple of days. Any specific reason why this issue is considered difficult?
Jacobian matrices shouldn't be too bad to implement (the Jacobian is just a collection of gradients, one per output component), but I have not figured out higher-order derivatives. From what I've read, they're a lot more work.
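Concretely, row i of the Jacobian is exactly the gradient of the i-th output component, so one reverse pass per output yields the full matrix. A rough sketch of that idea on top of the existing `accumulate`/`wrt` calls; the `jacobian` helper itself is hypothetical, not an existing RustQuant function:

```rust
use RustQuant::autodiff::*;

/// Hypothetical helper (not in RustQuant): builds the Jacobian of a
/// vector-valued function as a stack of gradients, running one
/// reverse pass per output component.
fn jacobian<'v>(outputs: &[Variable<'v>], inputs: &[Variable<'v>]) -> Vec<Vec<f64>> {
    outputs
        .iter()
        .map(|f_i| {
            // Reverse pass for this output component...
            let grad = f_i.accumulate();
            // ...then read off the partials w.r.t. each input.
            inputs.iter().map(|x_j| grad.wrt(x_j)).collect()
        })
        .collect()
}

fn main() {
    let g = Graph::new();
    let (x, y) = (g.var(1.0), g.var(2.0));

    // f : R^2 -> R^2, with components (x * y, x + y).
    let f = [x * y, x + y];

    // Expected: [[2.0, 1.0], [1.0, 1.0]].
    println!("{:?}", jacobian(&f, &[x, y]));
}
```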
I have looked through the codebase and I am interested in tackling this issue, but I will need a bit of time, since I only recently started playing with Rust and have plenty of knowledge gaps to fill...
- I have found a description of an autodiff method you are implementing. Is your implementation from scratch, or is it based on something else? If you have some specific resources, please mention them; that would be very helpful.
- It looks like the Jacobian can be realised with the existing `.accumulate` method on a `VariableArray` instance. I was not able to verify that it works yet, though.
- Seems that the Hessian should be safely doable as a double pass of `accumulate`, but I haven't thought of how to implement that in practice yet.
The two best references I know of are:
- Evaluating Derivatives - Griewank & Walther
- The Art of Differentiating Computer Programs - Naumann
The `VariableArray` object is really a WIP and does not actually work as intended, since we can't fill `ndarray` arrays (nor `nalgebra` matrices/vectors) with the `Variable` type.
Applying `accumulate` to a vector-output function should be all that needs to happen, just wrapped in a nice/intuitive API.
The last point about the Hessian is what would be ideal, but since the current methods return `f64`s rather than `Variable`s, we can't apply `accumulate` twice.
For Hessian accumulation, there are three modes: forward-over-reverse, reverse-over-forward, and reverse-over-reverse.
We would have to use the last of these, since forward mode is not implemented.
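For reference, reverse-over-reverse treats each entry of the gradient as a new scalar function and differentiates it again with a second reverse pass:

$$
H_{jk} = \frac{\partial}{\partial x_k} \left( \frac{\partial f}{\partial x_j} \right)
$$

That is, column j of the Hessian is the gradient of the j-th gradient entry. This only works if the first pass returns `Variable`s that are still attached to the graph; since `accumulate` currently returns plain `f64`s, there is nothing left to differentiate a second time.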