
Python code for the book "Probabilistic Machine Learning" by Kevin Murphy

41 pyprobml issues

Extend https://github.com/probml/pyprobml/issues/767 by adding rank-1 Gaussian VI to the comparison. The method is described in Sec. 9.3.5. There is some JAX code in [vb_gauss_lowrank](https://github.com/probml/pyprobml/blob/master/scripts/vb_gauss_lowrank.py). See also [this script](https://github.com/probml/pyprobml/blob/master/scripts/vb_gauss_lowrank_labour_force_demo.py) which reproduces...
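For orientation, here is a minimal NumPy sketch of the rank-1 Gaussian reparameterization such a comparison would rely on; the function name and interface are hypothetical and are not taken from `vb_gauss_lowrank.py`.

```python
import numpy as np

def sample_rank1_gaussian(mu, b, d, rng, n_samples=1):
    """Sample from q(theta) = N(mu, diag(d**2) + b b^T).

    Reparameterization: theta = mu + z * b + eps * d with z ~ N(0, 1) and
    eps ~ N(0, I_D), so Cov[theta] = b b^T + diag(d^2) without ever forming
    the full D x D covariance matrix.
    """
    D = mu.shape[0]
    z = rng.standard_normal((n_samples, 1))    # shared rank-1 factor
    eps = rng.standard_normal((n_samples, D))  # per-dimension noise
    return mu + z * b + eps * d

rng = np.random.default_rng(0)
theta = sample_rank1_gaussian(np.zeros(3), np.array([1.0, 0.5, 0.0]),
                              np.array([0.2, 0.2, 0.2]), rng, n_samples=5000)
print(np.cov(theta, rowvar=False))  # approx. b b^T + diag(d^2)
```

An ELBO-based fit would optimize mu, b, and d via Monte Carlo gradients over such samples; that part is omitted here.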

# Instructions

* [Follow these guidelines](https://github.com/probml/pyprobml/blob/master/notebooks/README.md)

## Chapter 18: Gaussian processes

| Figure | Script | Notebook | PR | Author |
| :- | :- | :- | :-...

Figures

# Instructions

* [Follow these guidelines](https://github.com/probml/pyprobml/blob/master/notebooks/README.md)

## Chapter 17: Bayesian neural networks

| Figure | Script | Notebook | PR | Author |
| :- | :- | :- |...

## Description

- Re-implemented ADF for logistic regression.
- ADF now uses unscented sigma-point Gaussian quadrature in its update step, as suggested by the [original paper](https://ieeexplore.ieee.org/document/4383733) (a rough sketch of this quadrature step follows below).
- Standardized the dataset...
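The preview above does not show the new update step; as a rough, hypothetical illustration of the sigma-point quadrature it refers to, the snippet below approximates E[sigmoid(a)] for a Gaussian-distributed activation, which is the expectation an ADF update for logistic regression needs to moment-match.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def unscented_expect_sigmoid(mu, var, alpha=1.0, kappa=2.0):
    """Approximate E[sigmoid(a)] for a ~ N(mu, var) with a 1-D unscented
    (sigma-point) quadrature: three points mu and mu +/- sqrt((n + lam) * var)."""
    n = 1
    lam = alpha ** 2 * (n + kappa) - n
    spread = np.sqrt((n + lam) * var)
    points = np.array([mu, mu + spread, mu - spread])
    weights = np.array([lam / (n + lam),
                        0.5 / (n + lam),
                        0.5 / (n + lam)])
    return float(weights @ sigmoid(points))

print(unscented_expect_sigmoid(0.5, 2.0))  # compare against a Monte Carlo estimate
```

In a full ADF pass, this expectation (and the corresponding second moment) would be used to moment-match the Gaussian posterior over the weights after each observation.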

Fix the convergence issue in the ADF demo.

## Description

This PR updates the SNGP demo notebook to include self-contained implementations of the SNGP-specific layers for `Spectral Normalization` and `Random Features GP`. The implementation is a simplified port...

As part of the SNGP demo (#819, #983 and #1033), implement the SNGP layers directly instead of importing them from the [edward2](https://github.com/google/edward2) library.
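As a rough sketch of what a standalone `Random Features GP` head can look like (hypothetical class and parameter names, not the notebook's or edward2's actual API), assuming an RBF kernel approximated with random Fourier features and a regression-style Laplace precision matrix for predictive variance:

```python
import numpy as np

class RandomFeatureGP:
    """Minimal sketch of a random-Fourier-feature GP output layer."""

    def __init__(self, in_dim, num_features=128, ridge=1.0, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        self.W = rng.standard_normal((num_features, in_dim))   # fixed, not trained
        self.b = rng.uniform(0.0, 2 * np.pi, size=num_features)
        self.beta = np.zeros(num_features)                      # trainable output weights
        self.precision = ridge * np.eye(num_features)           # Laplace precision

    def features(self, h):
        # Random Fourier features approximating an RBF kernel.
        return np.sqrt(2.0 / len(self.b)) * np.cos(h @ self.W.T + self.b)

    def update_precision(self, h):
        # Accumulate phi phi^T over the training data (used for predictive variance).
        phi = self.features(h)
        self.precision += phi.T @ phi

    def predict(self, h):
        phi = self.features(h)
        mean = phi @ self.beta
        cov = phi @ np.linalg.solve(self.precision, phi.T)
        return mean, np.diag(cov)
```

The spectral-normalization wrapper on the hidden layers is the other half of SNGP and is omitted from this sketch.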

### When executing this code:

    try:
        from probml_utils import latexify, savefig, is_latexify_enabled
    except ModuleNotFoundError:
        %pip install -qq git+https://github.com/probml/probml-utils.git
        from probml_utils import latexify, savefig, is_latexify_enabled

### Error Obtained:

Installing build dependencies...

https://github.com/probml/pyprobml/blob/master/notebooks/book1/13/mixexpDemoOneToMany.ipynb The random initialization of parameters and weights occurs within the E-step (instead of before it), so each iteration does not actually use the parameters fitted by the M-step in...
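To make the intended control flow concrete, here is a minimal, self-contained EM loop for a simplified mixture of linear-regression experts (input-independent mixing weights; hypothetical code, not the notebook's implementation). The point is that the random initialization happens once, before the loop, so each E-step consumes the parameters produced by the previous M-step.

```python
import numpy as np

def fit_mixexp_em(x, y, K=3, n_iters=100, seed=0):
    """Minimal EM for a mixture of K linear-regression experts.
    Parameters are initialized once, BEFORE the loop, so every E-step
    uses the parameters produced by the previous M-step."""
    rng = np.random.default_rng(seed)
    N = len(x)
    X = np.column_stack([np.ones(N), x])            # design matrix with bias term
    # One-time random initialization (not inside the E-step).
    W = rng.standard_normal((K, 2))                 # expert regression weights
    sigma2 = np.full(K, float(np.var(y)))           # expert noise variances
    pi = np.full(K, 1.0 / K)                        # mixing weights
    for _ in range(n_iters):
        # E-step: responsibilities r[n, k] under the current parameters.
        mu = X @ W.T                                # (N, K) expert means
        log_lik = -0.5 * (np.log(2 * np.pi * sigma2)
                          + (y[:, None] - mu) ** 2 / sigma2)
        log_r = np.log(pi) + log_lik
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted least squares per expert, then noise and mixing weights.
        for k in range(K):
            Rk = r[:, k]
            A = X.T @ (Rk[:, None] * X)
            W[k] = np.linalg.solve(A, X.T @ (Rk * y))
            resid = y - X @ W[k]
            sigma2[k] = (Rk * resid ** 2).sum() / Rk.sum()
        pi = r.mean(axis=0)
    return W, sigma2, pi
```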

On p.25 of book 1 (in the latest available online version, dated June 2023), in Section 1.5.4.2, it is stated that we often normalize each row of the TF-IDF...
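For reference, a minimal sketch of a TF-IDF transform with per-row (per-document) L2 normalization; the exact tf and idf variants below are assumptions and may differ from the book's definitions.

```python
import numpy as np

def tfidf_rows_normalized(counts):
    """counts: (N_docs, N_terms) term-count matrix.
    Assumed variant: tf = log(1 + count), idf_j = log(N / (1 + df_j)),
    followed by L2 normalization of each row (document vector)."""
    N = counts.shape[0]
    tf = np.log1p(counts)
    df = (counts > 0).sum(axis=0)          # number of docs containing each term
    idf = np.log(N / (1.0 + df))
    X = tf * idf
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    return X / np.maximum(norms, 1e-12)    # unit-length rows
```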