BMU Laplace estimate
Adds the Laplace approximation to the point estimates. From a maximum a posteriori (MAP) estimate, the Hessian can be calculated, and its inverse is then the covariance matrix of a Gaussian approximation. This also works on multimodal posterior distributions by using multiple starting points for the optimization; the resulting Gaussian components are then weighted according to their relative posterior values.
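A minimal sketch of the idea on a toy log-posterior (names and the optimizer setup are illustrative, not this PR's API): find the MAP by minimizing the negative log-posterior, take the Hessian there, and use its inverse as the covariance of a Gaussian approximation.

```julia
using Optim, ForwardDiff, LinearAlgebra, Distributions

# Toy log-posterior: Gaussian centered at [1, 2] (purely illustrative)
logposterior(x) = -0.5 * sum(abs2, x .- [1.0, 2.0])

# MAP estimate: minimize the negative log-posterior
result = optimize(x -> -logposterior(x), zeros(2), Newton(); autodiff=:forward)
μ = Optim.minimizer(result)

# Hessian of the negative log-posterior at the MAP;
# its inverse is the covariance of the Laplace approximation
H = ForwardDiff.hessian(x -> -logposterior(x), μ)
approx = MvNormal(μ, Symmetric(inv(H)))
```

For this quadratic toy case the approximation is exact: `μ ≈ [1, 2]` and the covariance is the identity.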
Currently the output is a DataFrame holding the MAP estimates and the covariance matrices, plus a second output with a function handle for the (log-)posterior. This could be enhanced by using @lukasfritsch's implementation of Gaussian mixtures. It would also benefit from a better approximation of the Hessian: I currently use a finite-differences approximation adopted from old Matlab code, but this could surely be improved with automatic differentiation or with FiniteDiff.jl or FiniteDifferences.jl. Then only this line needs to be changed:
```julia
hess = [inv(fd_hessian(optimTarget, var, fddist)) for var in eachrow(vars)]
```
It also lacks documentation right now, but it works very nicely on some examples.
Codecov Report
Attention: Patch coverage is 13.92405% with 68 lines in your changes missing coverage. Please review.
Project coverage is 90.98%. Comparing base (f5ee6cc) to head (f57d492).
| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/modelupdating/bayesianMAP.jl | 13.92% | 68 Missing :warning: |
Additional details and impacted files
```diff
@@            Coverage Diff             @@
##           master     #252      +/-   ##
==========================================
- Coverage   94.30%   90.98%    -3.32%
==========================================
  Files          42       42
  Lines        1772     1842       +70
==========================================
+ Hits         1671     1676        +5
- Misses        101      166       +65
```
:umbrella: View full report in Codecov by Sentry.
We're most likely going to use Zygote for AD with the GPs. It should be a good candidate for the Hessian here as well.
Since we already have a dependency on FiniteDifferences, you could call `jacobian` twice to get the Hessian instead of rolling your own implementation. Maybe write a `hessian` function that does this and put it in util.
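The suggested utility could look something like this: differentiate the gradient with FiniteDifferences.jl to obtain the Hessian. The function name and default method are hypothetical.

```julia
using FiniteDifferences

# Hypothetical utility: Hessian as the Jacobian of the gradient
function fd_hessian(f, x; fdm=central_fdm(5, 1))
    g(y) = grad(fdm, f, y)[1]       # gradient of f at y
    return jacobian(fdm, g, x)[1]   # Jacobian of the gradient = Hessian
end

H = fd_hessian(x -> 0.5 * sum(abs2, x), [1.0, 2.0])
```

For the quadratic above, `H` is (up to finite-difference error) the 2×2 identity.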
With the updated `JointDistribution`, does it make sense to directly construct a `MixtureModel` and return the joint distribution?
It does, yes.
I was waiting for the JointDistribution to be merged so that I can use it.
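Constructing the multimodal result as a Distributions.jl `MixtureModel` could look like the following sketch; the means, inverse-Hessian covariances, and weights are illustrative placeholders, not values from the PR.

```julia
using Distributions, LinearAlgebra

μs = [[0.0, 0.0], [3.0, 3.0]]                  # MAP estimates per mode (illustrative)
Σs = [Matrix(1.0I, 2, 2), Matrix(0.5I, 2, 2)]  # inverse Hessians per mode (illustrative)
w  = [0.7, 0.3]                                # normalized relative posterior weights

# Gaussian mixture over the per-mode Laplace approximations
mix = MixtureModel([MvNormal(μ, Σ) for (μ, Σ) in zip(μs, Σs)], w)
```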