
Implement Hessian matrices for vector-valued functions

Open bchretien opened this issue 10 years ago • 2 comments

If we have f: R^n -> R^m, then Jac(f) is an m x n matrix, and Hess(f) is an order-3 tensor (m x n x n). Currently, Hessian matrices are stored as 2D matrices, implicitly assuming that we deal with scalar-valued functions (m = 1).

Note that Hessian matrices are symmetric, so we could maybe use Eigen's self-adjoint view.

bchretien avatar May 03 '14 13:05 bchretien

Actually, if you use the same strategy as impl_gradient, then impl_hessian can be represented with matrices. I.e. you consider non-scalar functions as the concatenation of m scalar functions, and you pick one from the list using an index (cf. the impl_gradient prototype).

I would go for this first. One reason is that it keeps the interface consistent and uses matrices only. Yes, I was thinking of using a better matrix type, but so far I haven't implemented it.

Another alternative would be to use Eigen's unsupported module for tensor representation. Right now, I don't know enough about this module to want to rely on it, but it is worth mentioning anyway.

thomas-moulard avatar May 05 '14 04:05 thomas-moulard

I've never tried Eigen's unsupported tensors either. I guess they're still doing a lot of work on it, so the API will probably change quite a bit in the near future. Apparently they're also adding support for symmetries.

I guess we can indeed start by concatenating everything into a 2D matrix, and leave any optimization for later if someone actually uses that feature.

bchretien avatar May 05 '14 09:05 bchretien