Compute Hessian for error estimates

Open · pmelchior opened this issue 5 years ago · 1 comment

Currently, we use the V matrix from Adam as the inverse variance, but it is just the (exponentially averaged) squared gradient, not the curvature. Autograd can compute the Hessian, so we should implement/offer it for error estimation. It may be very expensive, in which case we should make it optional somehow.
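A minimal sketch of why Adam's V is not a curvature estimate (illustrative code, not scarlet's; `adam_second_moment` is a hypothetical name): the second-moment accumulator converges to the squared gradient even when the true Hessian is zero.

```python
import numpy as np

# Adam's second-moment accumulator v is an exponential moving average of
# the squared gradient g**2, not the Hessian (curvature) of the loss.
def adam_second_moment(grads, beta2=0.999):
    v = np.zeros_like(grads[0])
    for t, g in enumerate(grads, start=1):
        v = beta2 * v + (1 - beta2) * g**2
        v_hat = v / (1 - beta2**t)  # bias-corrected estimate
    return v_hat

# For f(x) = a*x the gradient is the constant a, so v_hat approaches a**2,
# while the true Hessian f''(x) = 0: v says nothing about curvature.
grads = [np.array([3.0])] * 100
print(adam_second_moment(grads))  # approaches [9.0]
```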

pmelchior commented on Dec 12 '19

autograd's `hessian` only works for a single-parameter model, and other autodiff frameworks may not provide a Hessian implementation at all.

However, a cheaper and viable alternative is to approximate the Hessian by the product of Jacobians of the residuals, i.e. the Gauss-Newton approximation (details here).
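A sketch of the Jacobian-product approximation under the usual least-squares assumptions: for a loss L(p) = 0.5 * sum_k f_k(p)^2, the Hessian is approximately J^T J, where J_kj = d f_k / d p_j. A finite-difference Jacobian stands in here for `autograd.jacobian`; the model and all names are illustrative.

```python
import numpy as np

def residuals(p):
    # toy noise-normalized deviations f_k: data minus a linear model
    x = np.array([0.0, 1.0, 2.0])
    data = np.array([1.0, 3.0, 5.0])
    return data - (p[0] + p[1] * x)

def fd_jacobian(f, p, eps=1e-6):
    # forward-difference Jacobian, a stand-in for autograd.jacobian
    f0 = f(p)
    J = np.empty((f0.size, p.size))
    for j in range(p.size):
        dp = np.zeros_like(p)
        dp[j] = eps
        J[:, j] = (f(p + dp) - f0) / eps
    return J

p = np.array([0.5, 1.5])
J = fd_jacobian(residuals, p)
H = J.T @ J  # Gauss-Newton approximation of the Hessian of the loss
print(H)
```

For a model that is linear in its parameters, as in this toy case, the approximation is exact; for nonlinear models it drops the second-derivative term weighted by the residuals.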

This requires access to the f_k, i.e. the properly normalized deviations of the rendered model from the observations. Only Observation can provide that, so a new method Observation.get_elementwise_loss (or similar) is needed that returns the same ingredients as get_loss (https://github.com/pmelchior/scarlet/blob/master/scarlet/observation.py#L173), but without summing over all pixels. Blend then needs to accumulate those for all observations.
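A hypothetical sketch of what such a method could return, assuming (not verified against scarlet's actual code) that get_loss computes an inverse-variance-weighted quadratic loss; the class and attribute names here are illustrative only:

```python
import numpy as np

class Observation:
    """Toy stand-in, not the scarlet class."""
    def __init__(self, images, weights):
        self.images = images
        self.weights = weights  # assumed inverse-variance weights

    def get_elementwise_loss(self, model):
        # Per-pixel, noise-normalized deviations f_k. Summing their
        # squares recovers the quadratic part of the summed loss, so
        # no information is lost relative to get_loss.
        return np.sqrt(self.weights) * (model - self.images)

obs = Observation(np.ones((4, 4)), np.full((4, 4), 2.0))
f = obs.get_elementwise_loss(np.full((4, 4), 1.5))
quadratic_loss = 0.5 * np.sum(f**2)
print(quadratic_loss)
```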

I'm also starting to think that we should rename loss to log_likelihood, for clarity.

The first-order Hessian can then be computed with autograd.elementwise_grad or jacobian (I'm not sure which). From the Hessian, it's the usual procedure: invert, select the diagonal, take the square root. As this is an expensive operation, I'd prefer that the user request it through a new method Blend.estimate_errors, which would set the std property of all parameters.
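The invert/diagonal/sqrt procedure can be sketched as follows (a hypothetical `estimate_errors` helper, not the proposed Blend method itself; the Jacobian here is a toy input):

```python
import numpy as np

def estimate_errors(J):
    # Given the Jacobian J of the noise-normalized residuals f_k:
    H = J.T @ J                    # first-order (Gauss-Newton) Hessian
    cov = np.linalg.inv(H)         # parameter covariance matrix
    return np.sqrt(np.diag(cov))   # one standard error per parameter

# toy Jacobian of residuals for a 2-parameter model over 3 pixels
J = np.array([[-1.0, 0.0], [-1.0, -1.0], [-1.0, -2.0]])
print(estimate_errors(J))
```

In a real implementation these values would be stored on each parameter's std property rather than returned, and the inverse should probably be guarded against a singular or ill-conditioned H (e.g. via a pseudo-inverse).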

pmelchior commented on Jan 15 '20