Results: 32 comments of Pierre Guetschel

Yes, agreed, I only tested the BIDS conversion for a few datasets. I don't think these new tests should be executed every time we push a new commit to GitHub...
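One common pattern for such expensive tests is to gate them behind an environment variable so they only run in a dedicated (e.g. nightly) CI job rather than on every push. A minimal sketch with the standard library's `unittest` (the variable name `RUN_BIDS_TESTS` and the test body are assumptions, not MOABB's actual test setup):

```python
import os
import unittest

# Hypothetical gate: the expensive BIDS-conversion tests run only when the
# RUN_BIDS_TESTS environment variable is explicitly set, e.g. by a scheduled
# CI workflow; ordinary pushes skip them.
RUN_BIDS_TESTS = os.environ.get("RUN_BIDS_TESTS") == "1"


class TestBIDSConversion(unittest.TestCase):
    @unittest.skipUnless(RUN_BIDS_TESTS, "set RUN_BIDS_TESTS=1 to run")
    def test_roundtrip(self):
        # Placeholder for the real dataset -> BIDS -> dataset round-trip check.
        self.assertTrue(True)
```

With pytest one would typically express the same gate as a `skipif` marker instead; the mechanism (an opt-in environment variable) is the same.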

- [ ] If this feature gets implemented, we should demonstrate it in the disk_cache example

> I like the idea, maybe it will help to better control the experiments. Can you update the poetry config to include fire? @bruAristimunha I removed the `run.py` file and updated...

#429 and #430 would be easy to implement and would be nice to have in 0.6. Let's discuss them during the meeting.

Thanks @tomMoral for your feedback!! But I'm not sure this would completely work because we have some quite specific constraints: - One of the most expensive steps is the call to...

Yes, the idea is to have the git sha of the HEAD written somewhere in the generated HTML files so that we are sure that there is always a difference...

Why are we not using `actions/cache/restore@v3` to get the build cache here: https://github.com/NeuroTechX/moabb/blob/29c04234bc17af807a73f9cfdecfc1b995281efa/.github/workflows/docs.yml#L112 See here: https://github.com/actions/cache/blob/v3.3.3/caching-strategies.md#make-cache-read-only--reuse-cache-from-centralized-job
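A restore-only step in the docs workflow might look like the following sketch (the `path`, `key`, and `restore-keys` values are assumptions for illustration, not the actual values from `docs.yml`):

```yaml
# Hypothetical sketch: restore the docs build cache without ever writing to it,
# so this job reuses the cache produced by a centralized build job.
- name: Restore docs build cache
  uses: actions/cache/restore@v3
  with:
    path: docs/build
    key: docs-build-${{ github.sha }}
    restore-keys: |
      docs-build-
```

Unlike the combined `actions/cache@v3` action, the `restore` sub-action never saves a cache at the end of the job, which is what makes the cache effectively read-only for this workflow.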

Why is the cache named "Cache datasets and docs"? I don't see where we are caching the datasets

I think we should use a "short lived cache" for the docs build: https://github.com/actions/cache/blob/v3.3.3/caching-strategies.md#creating-a-short-lived-cache Otherwise there might be interactions between different workflow runs.
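The short-lived-cache idea can be sketched as follows: key the cache on something unique per run (here `github.run_id`), so each workflow run writes its own cache entry and cannot clobber another run's, while `restore-keys` still lets a new run start from the most recent entry. The `path` and key prefix are hypothetical:

```yaml
# Hypothetical sketch of a short-lived cache: the key is unique per workflow
# run, so concurrent runs never overwrite each other's caches; restore-keys
# falls back to the newest matching cache from a previous run.
- name: Cache docs build
  uses: actions/cache@v3
  with:
    path: docs/build
    key: docs-build-${{ github.run_id }}
    restore-keys: |
      docs-build-
```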

Now trying to fix only the caching issue in PR #632. Then we will deal with the deployment issue in another PR. (Let's keep this PR to remember what we...