
BHHH for Likelihood Optimization

ParadaCarleton opened this issue 2 years ago · 3 comments

BHHH is a second-order algorithm that approximates the Hessian of a log-likelihood with the sum of outer products of the per-observation scores (gradients). This is justified by the information matrix equality in statistics, which states that at the true parameter, E[s(θ) s(θ)'] = -E[H(θ)], where s is the score and H the Hessian of a log-likelihood contribution. The outer-product-of-gradients sum is therefore a consistent estimator of the negative expected Hessian, and it can usually be computed much more cheaply than the full Hessian. This method is widely used in statistics.
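As a standalone numerical illustration of the information matrix equality (a hypothetical sketch in plain Python, not Optim.jl code): for the exponential model log f(x; λ) = log λ − λx, the per-observation score is s = 1/λ − x and the Hessian is −1/λ², so E[s²] should match −E[H] = 1/λ²:

```python
import random

# Hypothetical sketch: check E[s^2] ~= -E[H] for the exponential model.
# log f(x; lam) = log(lam) - lam*x  =>  score s = 1/lam - x, Hessian H = -1/lam^2
random.seed(0)
lam = 2.0
xs = [random.expovariate(lam) for _ in range(200_000)]

# outer-product-of-gradients estimate (scalar here, since the parameter is scalar)
opg = sum((1 / lam - x) ** 2 for x in xs) / len(xs)
neg_expected_hessian = 1 / lam**2  # -E[H], known analytically

print(opg, neg_expected_hessian)  # both close to 0.25
```

Monte Carlo error shrinks as 1/sqrt(n), so with 200k draws the two quantities agree to a few decimal places.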

Is there an implementation of BHHH in Optim.jl, or are there any plans to add it?

ParadaCarleton · Oct 16 '23

I'm well aware of BHHH, but I'm not sure what you would want beyond the Newton method? Is it because you want Optim to automatically write the outer product of the score using AD?

pkofod · Dec 12 '23

Yep!

ParadaCarleton · Dec 12 '23

Okay, then I suppose you'd have to either (a) have a vector objective type that is interpreted according to some aggregation (a sum here, since you'd have likelihood contributions in your use case), or (b) simply provide a constructor that builds a normal objective type from the contributions?

pkofod · Jan 26 '24
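Summing the likelihood contributions and their score outer products is all BHHH needs. A minimal sketch of the resulting iteration (plain Python with hand-coded scores for the exponential example above, not a proposed Optim.jl API):

```python
import random

# Hypothetical BHHH sketch: maximize sum_i [log(lam) - lam*x_i] over lam > 0.
# Per-observation score: s_i(lam) = 1/lam - x_i
# BHHH step: lam += g / B, with g = sum(s_i) and B = sum(s_i^2) (the OPG "Hessian")
random.seed(1)
xs = [random.expovariate(2.0) for _ in range(10_000)]

lam = 1.0  # starting guess
for _ in range(100):
    scores = [1 / lam - x for x in xs]
    g = sum(scores)                   # total score (gradient of the log-likelihood)
    B = sum(s * s for s in scores)    # outer product of scores (scalar here)
    step = g / B
    lam += step
    if abs(step) < 1e-12:
        break

mle = len(xs) / sum(xs)  # closed-form MLE for comparison
print(abs(lam - mle) < 1e-8)  # True: the BHHH fixed point is the MLE
```

The fixed point of the update is exactly where the total score vanishes, i.e. the MLE, and near the optimum the OPG matrix closely tracks the negative Hessian, so convergence is Newton-like without ever forming second derivatives.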