KL-divergence-estimators
Added setup.py
I added a simple setup.py to your package so that it's easily installable and usable in other projects. I wasn't sure about your preferred naming, so feel free to rename everything.
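For reference, a minimal sketch of the kind of setup.py this adds; the package name, module layout, and dependencies below are assumptions and may not match the actual files in the PR:

```python
# Minimal setup.py sketch. The package/module name and the NumPy/SciPy
# dependencies are assumptions; adjust to the actual layout of the repo.
from setuptools import setup, find_packages

setup(
    name="kl-divergence-estimators",
    version="0.1.0",
    description="k-nearest-neighbour estimators of the KL divergence",
    packages=find_packages(),
    install_requires=["numpy", "scipy"],
)
```

With something like this in place, the package can be installed into another project with `pip install -e .` from the repository root.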
Added an install and usage paragraph to README.md
Added a figure to show the systematic overestimation of the KL divergence when the two sample sizes differ.
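To make that claim concrete, here is a small self-contained sketch (not the repository's own estimator) that evaluates a standard 1-nearest-neighbour KL estimator with matched and mismatched sample sizes; `knn_kl_estimate` and the Gaussian test distributions are chosen purely for illustration:

```python
# Standalone sketch: estimate D(P||Q) with a 1-NN estimator and compare
# matched vs. mismatched sample sizes. Not the estimator from this repo;
# function name and test distributions are illustrative assumptions.
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_estimate(x, y):
    """1-NN KL divergence estimate D(P||Q) from samples x ~ P, y ~ Q."""
    n, d = x.shape
    m = y.shape[0]
    rho = cKDTree(x).query(x, k=2)[0][:, 1]  # NN distance within x (skip self)
    nu = cKDTree(y).query(x, k=1)[0]         # NN distance from x to y
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

rng = np.random.default_rng(0)
true_kl = 0.5 * 0.5 ** 2  # KL(N(0,1) || N(0.5,1)) = mu^2 / 2 = 0.125

for n, m in [(1000, 1000), (1000, 100), (100, 1000)]:
    estimates = [
        knn_kl_estimate(rng.normal(0.0, 1.0, (n, 1)),
                        rng.normal(0.5, 1.0, (m, 1)))
        for _ in range(20)
    ]
    print(f"n={n:4d}, m={m:4d}: mean estimate {np.mean(estimates):+.3f} "
          f"(true {true_kl:.3f})")
```

A sweep of this kind over different n and m is presumably what the added figure visualises.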
Thanks @AndreasGerken, and apologies for the slow reply! This looks great, and the systematic overestimation is an interesting result; I wasn't aware of that effect.
Could you remove the .ipynb checkpoints (and maybe add them to the .gitignore)? Otherwise this looks good to go.