Nipun Batra
Use modAL; use MNIST or a cats/dogs-style dataset like the one used in the earlier issue.
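A minimal sketch of the active-learning loop (plain scikit-learn standing in for modAL, and the scikit-learn digits dataset standing in for MNIST — modAL's `ActiveLearner` wraps exactly this uncertainty-sampling query loop):

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

# Digits as a small stand-in for MNIST
X, y = load_digits(return_X_y=True)
rng = np.random.default_rng(0)
idx = rng.permutation(len(X))
labeled, pool, test = idx[:20], idx[20:1000], idx[1000:]

clf = LogisticRegression(max_iter=1000)
for _ in range(10):                       # 10 query rounds, 5 points each
    clf.fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[pool])
    # uncertainty sampling: pick the pool points the model is least sure about
    uncertain = np.argsort(proba.max(axis=1))[:5]
    labeled = np.concatenate([labeled, pool[uncertain]])
    pool = np.delete(pool, uncertain)

clf.fit(X[labeled], y[labeled])
acc = clf.score(X[test], y[test])
print(round(acc, 3))
```

To make the issue's point, run the same loop with `uncertain` replaced by a random choice from the pool and compare test accuracy at each round.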
1. For optimising hyperparameters of a NN. 2. For any simple black-box function. Main goal: show that this is better than random sampling and more efficient than grid search.
Create a notebook on a Mauna Loa-like dataset, or any simple dataset. Show the MLE fit in the extrapolation regions. Then show how, if we used Bayesian regression, we would get uncertainty...
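A small sketch of the contrast, on synthetic CO2-like data (a made-up stand-in for the real Mauna Loa series): least squares gives one extrapolated curve with no error bars, while Bayesian regression's predictive std grows away from the data.

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

# Synthetic trend + seasonal cycle as a stand-in for Mauna Loa CO2
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 80)
y = 315 + 1.5 * t + np.sin(2 * np.pi * t) + rng.normal(0, 0.3, t.size)

def features(t, degree=3):
    return np.vander(t, degree + 1, increasing=True)

# MLE (ordinary least squares): a single curve, no uncertainty
w, *_ = np.linalg.lstsq(features(t), y, rcond=None)

# Bayesian linear regression: predictive std available at any input
br = BayesianRidge().fit(features(t), y)
t_new = np.array([5.0, 20.0])            # in-sample vs. extrapolation point
mu, std = br.predict(features(t_new), return_std=True)
print(std)
```

The plot for the notebook: shade `mu ± 2*std` over a grid extending past the data and overlay the MLE curve.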
Create a notebook inspired by: https://jramkiss.github.io/2020/07/29/overconfident-nn/ Starter code: https://github.com/nipunbatra/pml-teaching/blob/master/notebooks/cats-dogs.ipynb Use Hamiltorch and/or Laplace Redux for getting epistemic uncertainty. Also, show a model trained on MNIST and its performance on not-MNIST or...
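The overconfidence phenomenon itself can be demonstrated before bringing in Hamiltorch/Laplace Redux. A crude sketch (plain scikit-learn, digits standing in for MNIST, pure noise standing in for not-MNIST): a point-estimate classifier assigns confident softmax probabilities even to inputs nothing like its training data.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

X, y = load_digits(return_X_y=True)
clf = LogisticRegression(max_iter=2000).fit(X, y)

# "not-MNIST" stand-in: pure noise images the model has never seen
rng = np.random.default_rng(0)
noise = rng.uniform(0, 16, size=(200, X.shape[1]))
conf = clf.predict_proba(noise).max(axis=1)
print(round(float(conf.mean()), 3))       # well above the 0.1 chance level
```

The Bayesian versions (HMC weights via Hamiltorch, or a post-hoc Laplace approximation via Laplace Redux) should pull these out-of-distribution confidences back toward chance.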
- Implement BART in Torch (with Hamiltorch sampler)
- Implement BART in JAX (with BlackJax sampler)
- Add RW Metropolis-Hastings to Hamiltorch
- Streamlit equivalent of https://chi-feng.github.io/mcmc-demo/app.html?algorithm=HamiltonianMC&target=banana
- Fully...
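For the RW Metropolis-Hastings item, a minimal NumPy reference implementation to port into Hamiltorch, targeting a banana-shaped density like the one in the linked demo (the exact density below is an assumed stand-in, not the demo's code):

```python
import numpy as np

def log_banana(z):
    # banana-shaped target, up to an additive constant;
    # the first coordinate is marginally standard normal
    x, y = z
    return -0.5 * x ** 2 - 0.5 * (y - x ** 2) ** 2

def rw_metropolis(n_samples, step=0.7, seed=0):
    rng = np.random.default_rng(seed)
    z = np.zeros(2)
    lp = log_banana(z)
    samples = np.empty((n_samples, 2))
    accepts = 0
    for i in range(n_samples):
        prop = z + step * rng.normal(size=2)   # symmetric Gaussian proposal
        lp_prop = log_banana(prop)
        # symmetric proposal => acceptance ratio is just the density ratio
        if np.log(rng.uniform()) < lp_prop - lp:
            z, lp = prop, lp_prop
            accepts += 1
        samples[i] = z
    return samples, accepts / n_samples

samples, rate = rw_metropolis(50000)
print(round(rate, 2))
```

The same loop, with `log_banana` swapped for a model's log-posterior, is the sampler to add alongside Hamiltorch's HMC.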
See my notebook: https://github.com/nipunbatra/pml-teaching/blob/master/notebooks/mcmc-1.ipynb See the Hamiltorch documentation, get its examples for an MLP Bayesian NN and a Bayesian CNN (regression and classification) working, and make plots similar to those made earlier in...
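Before wiring up Hamiltorch, it helps to see the HMC mechanics it automates. A self-contained NumPy sketch on a standard normal target (leapfrog integration plus a Metropolis correction — the same scheme Hamiltorch applies to network weights):

```python
import numpy as np

def log_prob(x):
    return -0.5 * np.sum(x ** 2)         # standard normal target

def grad_log_prob(x):
    return -x

def hmc(n_samples, eps=0.1, L=20, dim=1, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    samples = np.empty((n_samples, dim))
    for i in range(n_samples):
        p = rng.normal(size=dim)          # resample momentum each iteration
        x_new, p_new = x.copy(), p.copy()
        # leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * eps * grad_log_prob(x_new)
        for _ in range(L - 1):
            x_new += eps * p_new
            p_new += eps * grad_log_prob(x_new)
        x_new += eps * p_new
        p_new += 0.5 * eps * grad_log_prob(x_new)
        # Metropolis correction for integration error
        h_old = -log_prob(x) + 0.5 * np.sum(p ** 2)
        h_new = -log_prob(x_new) + 0.5 * np.sum(p_new ** 2)
        if np.log(rng.uniform()) < h_old - h_new:
            x = x_new
        samples[i] = x
    return samples

s = hmc(5000)
print(round(float(s.mean()), 2), round(float(s.std()), 2))
```

In the Bayesian NN case, `log_prob` becomes the log-posterior over flattened weights and the gradient comes from autograd, which is what `hamiltorch.sample_model` handles.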
In relation to the Laplace approximation. Good resource: the 3Blue1Brown video for the univariate case.
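The connection in one computation: a second-order Taylor expansion of the log-density around its mode yields a Gaussian, which is the Laplace approximation. A univariate check against a Gamma target (the shape and evaluation point below are arbitrary choices):

```python
import numpy as np
from scipy.stats import gamma, norm

# Target: Gamma(shape=a, scale=1); log p(x) = (a-1) log x - x + const
a = 5.0
mode = a - 1.0                           # argmax of the log-density
# d^2/dx^2 log p(x) = -(a-1)/x^2; Laplace variance = -1/curvature at the mode
var = mode ** 2 / (a - 1.0)

x = 4.5                                  # point near the mode
exact = gamma.pdf(x, a)
approx = norm.pdf(x, loc=mode, scale=np.sqrt(var))
print(round(float(exact), 4), round(float(approx), 4))
```

Near the mode the two densities agree closely; the approximation degrades in the tails, where the Gamma's skew is invisible to a second-order expansion.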
https://jakevdp.github.io/PythonDataScienceHandbook/05.06-linear-regression.html#Gaussian-basis-functions
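A compact version of the linked idea — linear regression on Gaussian basis functions — as a scikit-learn transformer (the class name and defaults here are our own, sketched after the handbook's example):

```python
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

class GaussianFeatures(BaseEstimator, TransformerMixin):
    """Map 1-D x to Gaussian bumps centred on a uniform grid."""
    def __init__(self, n_centers=20, width=1.0):
        self.n_centers = n_centers
        self.width = width

    def fit(self, X, y=None):
        self.centers_ = np.linspace(X.min(), X.max(), self.n_centers)
        return self

    def transform(self, X):
        # (n_samples, 1) minus (n_centers,) broadcasts to (n_samples, n_centers)
        return np.exp(-0.5 * ((X - self.centers_) / self.width) ** 2)

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)[:, None]
y = np.sin(x).ravel() + rng.normal(0, 0.1, 100)

model = make_pipeline(GaussianFeatures(20, 1.0), LinearRegression())
model.fit(x, y)
print(round(model.score(x, y), 3))
```

The model stays linear in its parameters, so everything from the Bayesian-regression issues above applies unchanged with these features.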
Here is an excellent talk from ProbAI summer school last year by Yingzhen Li https://www.youtube.com/watch?v=cRzNWVjnD6I Notes: http://yingzhenli.net/home/pdf/ProbAI2022_lecture_note.pdf [[video](https://t.co/dHTWGBgX1x)][[notes](http://yingzhenli.net/home/pdf/ProbAI2022_lecture_note.pdf)][[demo1](https://bit.ly/3zF1zvA)][[demo2](https://bit.ly/3Hd1Ass)]