pymovements
cache public datasets and use them in integration tests
Description of the problem
There were several cases during this project (and #517 is still open) where following the complete 10-minute pymovements tutorial would not have worked for the actual public datasets, but only for the toy dataset used in the Jupyter notebook.
Up to now we have usually decided against testing these public datasets, as downloading all of them for each and every commit here on GitHub would be a waste of resources.
Description of a solution
However, there is a GitHub Action for caching dependencies: https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#comparing-artifacts-and-dependency-caching
GitHub runners could then reuse cached resources across workflow runs, and since unused caches are eventually evicted, a fresh download of the datasets would still be triggered regularly.
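A minimal sketch of such a workflow step, using `actions/cache`. The cache path, cache key, and test command below are illustrative assumptions, not pymovements defaults:

```yaml
# Sketch: restore/save a dataset cache between workflow runs.
# Path, key, and test command are assumptions for illustration.
- name: Cache public datasets
  uses: actions/cache@v4
  with:
    path: ~/.cache/pymovements
    # Bump the suffix manually to force a fresh download of all datasets.
    key: public-datasets-v1

- name: Run integration tests
  # Tests would download any dataset missing from the cache,
  # then reuse the cached copy on subsequent runs.
  run: python -m pytest tests/integration
```

On a cache miss the integration tests pay the full download cost once; every later run restores the datasets from the cache until the key changes or the cache is evicted.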