mne-hcp
Clean-up, update packaging, fix imports
This package is barely maintained, but I still see a question about it on the forum from time to time, or someone contacts me directly with questions about it. This PR fixes the deprecated imports, fixes the style issues, and updates the packaging.
WIP
@dengemann Do you remember what the dictionary.txt file is and whether it's actually needed? https://github.com/mne-tools/mne-hcp/blob/master/dictionary.txt It was added as part of the first commit.
Also, @dengemann or @alexrockhill would you have the datasets required to run the tutorials and examples? It's almost the last piece missing to get the build green.
That looks like a codespell dictionary. Nowadays you shouldn't need to package it separately; just install codespell and use its built-in dictionaries.
That's what I assumed and I removed it ;)
Dennis was saying last time that all the data is stored on AWS so that the CIs can run it, but we can't store it somewhere publicly accessible because of dataset protections. You can download the dataset here, though: https://db.humanconnectome.org/. It's not too hard to do; you just have to register an account and agree to the license.
@larsoner this is probably worth a look re: https://github.com/mne-tools/mne-python/pull/12731
and: https://github.com/mne-tools/mne-python/blob/071a98fae28bd7c32c6ce357076263ef01031994/doc/sphinxext/related_software.py#L41-L43
@mscheltienne ideally the reformatting changes would be in a separate PR so they could be added to .git-blame-ignore-revs. But maybe that's not worth it here.
I think we also want some very minimal CI runs -- I'll open a quick PR to add those first; then merging main into this PR should just make sure that things aren't more broken.
IIRC, I started to look into fetching the HCP dataset from AWS to test the changes in this PR and lacked the time. If someone has the time to figure out a gh-action to do so, we could add CIs and a doc build easily.
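To make the idea concrete, a fetch step in a workflow could look something like the sketch below. Everything in it is an assumption: the bucket path and subject ID are guesses based on the public HCP S3 layout (s3://hcp-openaccess, which still requires ConnectomeDB credentials), and the secret names are placeholders, not existing repo secrets.

```yaml
# Hypothetical GitHub Actions step; bucket path, subject ID, and
# secret names are assumptions that need checking against the actual
# HCP S3 layout and this repo's configured secrets.
- name: Fetch HCP testing data
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.HCP_AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.HCP_AWS_SECRET_ACCESS_KEY }}
  run: |
    pip install awscli
    aws s3 sync s3://hcp-openaccess/HCP_1200/105923/MEG \
      ~/mne-hcp-data/HCP/105923/MEG --region us-east-1
```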
@mscheltienne I pushed a couple of commits, I've been testing locally with:

$ MNE_HCP_TEST_DECIM=1 pytest -rfEXs --cov-report= --tb=short --durations 30 -v --cov=hcp --cov-report xml hcp

on latest MNE main, after running the new tools/get_testing_data.sh (~27GB data download!) with an appropriate ~/.s3cfg. I marked one test as xfail that we should probably fix, and there is a remaining error in hcp/tests/test_anatomy.py, but I probably just need more files. Will test a bit more tomorrow.
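For anyone else trying this: ~/.s3cfg is s3cmd's config file, so the script presumably shells out to s3cmd. A minimal sketch of what it might need (the key names and everything else here are assumptions; the actual credentials are the AWS keys you generate from your ConnectomeDB account at https://db.humanconnectome.org/):

```ini
# Hypothetical minimal ~/.s3cfg for s3cmd; replace the placeholders
# with the AWS credentials generated from your ConnectomeDB account.
[default]
access_key = YOUR_HCP_AWS_ACCESS_KEY
secret_key = YOUR_HCP_AWS_SECRET_KEY
```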
... we should wait for the mne-hcp-testing-data.tar.gz that @dengemann generated with this notebook, if possible. @dengemann if you can't find it, I might be close to being able to regenerate it. Then we might just upload it to osf.io as an encrypted archive and extract it using a password stored as a GitHub secret.
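The encrypt/decrypt round trip could be done with plain openssl, roughly like this sketch (file names are placeholders, and ARCHIVE_PASSWORD stands in for whatever the GitHub secret would be called):

```shell
# One-time: encrypt the testing archive with a password so it can
# live on osf.io. -pbkdf2 uses a proper key-derivation function;
# the password is read from the ARCHIVE_PASSWORD env var.
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in mne-hcp-testing-data.tar.gz \
    -out mne-hcp-testing-data.tar.gz.enc \
    -pass env:ARCHIVE_PASSWORD

# In CI, with ARCHIVE_PASSWORD exported from the GitHub secret:
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in mne-hcp-testing-data.tar.gz.enc \
    -out mne-hcp-testing-data.tar.gz \
    -pass env:ARCHIVE_PASSWORD
tar -xzf mne-hcp-testing-data.tar.gz
```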
Okay, a couple more commits, and now everything works on my system with the full dataset! If anyone else wants to test, you can download the ~27GB version and then run:
$ ./tools/get_testing_data.sh
$ MNE_HCP_TESTING_PATH=~/mne-hcp-data/HCP MNE_HCP_TEST_DECIM=1 pytest -rfEXs --cov-report= --tb=short --durations 30 -v --cov=hcp --cov-report xml hcp