dl-workshop
Crash course to master gradient-based machine learning. Also secretly a JAX course in disguise!
updates: - [github.com/psf/black: 21.11b1 → 21.12b0](https://github.com/psf/black/compare/21.11b1...21.12b0)
For the DP-GMM model, we should be able to avoid singularities by choosing an Inverse Gamma prior on the sigma parameter, I think.
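One way to see why this should work: the Inverse-Gamma density vanishes as sigma approaches zero, so the prior pushes posterior mass away from the singular solutions where a mixture component collapses onto a single data point. A minimal sketch using `scipy.stats.invgamma` (the shape/scale values here are illustrative assumptions, not the model's actual hyperparameters):

```python
from scipy.stats import invgamma

# Hypothetical Inverse-Gamma(a=2, scale=1) prior on sigma.
prior = invgamma(a=2.0, scale=1.0)

# The density near sigma = 0 is essentially zero, while moderate
# sigma values get appreciable mass -- so a collapsed-variance
# component is heavily penalized under this prior.
near_zero = prior.pdf(1e-3)
typical = prior.pdf(0.5)
print(near_zero < typical)  # True: mass is pushed away from sigma = 0
```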
From Michael Tullius (@mvtullius), who came by at SciPy 2021. > Hi Eric, Thank you for your presentation yesterday and for making all the materials available online. I would like...
As per title: decide whether to set it as an exercise or as an example.
tl;dr version to copy/paste into a corrected version of a notebook:

```python
def vmapped_func(array):
    result = []
    for element in array:
        result.append(func(element))
    result = np.stack(result)
    return result
```
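For context, a runnable sketch of this explicit-loop stand-in next to `jax.vmap` itself; `func` is assumed here to be a simple elementwise function (squaring), since the original snippet leaves it undefined:

```python
import jax
import jax.numpy as np


def func(x):
    # Stand-in elementwise function (an assumption for illustration).
    return x ** 2


def vmapped_func(array):
    # Explicit-loop equivalent of jax.vmap(func), for teaching purposes.
    result = []
    for element in array:
        result.append(func(element))
    return np.stack(result)


arr = np.arange(4.0)
loop_out = vmapped_func(arr)
vmap_out = jax.vmap(func)(arr)
print(np.allclose(loop_out, vmap_out))  # the two agree elementwise
```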
This would be a great extension to the workshop, as per comment from Andy Long.
`jupyter labextension install @jupyter-widgets/jupyterlab-manager` and related blocks are unclear about which command is needed for which use-case (`jupyter notebook` vs `jupyter-lab`), and so may accidentally be run twice by tutorial...
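A sketch of which command goes with which front-end, based on ipywidgets' own installation docs (version boundaries are stated as assumptions; JupyterLab 3+ generally needs no separate extension step):

```shell
# Classic Jupyter Notebook: enable the widgets nbextension.
pip install ipywidgets
jupyter nbextension enable --py widgetsnbextension

# JupyterLab 1.x / 2.x only: install the lab extension (requires Node.js).
jupyter labextension install @jupyter-widgets/jupyterlab-manager

# JupyterLab 3+: `pip install ipywidgets` alone is sufficient;
# no labextension install step is needed.
```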
As per title.
As per title.