Save results as soon as they are computed
As I understand it, right now we save the results to the HDF5 file only once the evaluation of a whole dataset is over, and then continue with the next dataset: https://github.com/NeuroTechX/moabb/blob/a9f2e4ca3abc02e5fb81d36da06760f11653e550/moabb/evaluations/base.py#L165
But if we evaluate many pipelines on the same dataset, the chances of failing before the end of the evaluation are high, and if we fail, all the results for that dataset are lost... Would it be possible to save the results as soon as they are computed (i.e. after every fold of every session of every subject)?
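To make the idea concrete, here is a minimal sketch of what per-fold saving could look like. The `save_fold_result` helper and the HDF5 group layout are purely illustrative assumptions, not moabb's actual results schema:

```python
import h5py


def save_fold_result(path, dataset, pipeline, subject, session, score):
    """Append one cross-validation fold result to the HDF5 file right away.

    Hypothetical layout: one group per dataset/pipeline holding resizable
    1-D datasets, so a crash only loses the fold currently being computed.
    """
    with h5py.File(path, "a") as f:
        grp = f.require_group(f"{dataset}/{pipeline}")
        # Create the per-field datasets on first use, resizable along axis 0.
        for name, dtype in (("subject", "i8"),
                            ("session", h5py.string_dtype()),
                            ("score", "f8")):
            if name not in grp:
                grp.create_dataset(name, shape=(0,), maxshape=(None,), dtype=dtype)
        # Append one row with the result of this fold.
        n = grp["score"].shape[0]
        for name, value in (("subject", subject),
                            ("session", session),
                            ("score", score)):
            grp[name].resize((n + 1,))
            grp[name][n] = value
```

The evaluation loop in base.py would then call something like this after each fold instead of writing only once per dataset.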
Potential issues I see:
- parallel access to the HDF5 file (see the sketch after this list)
- computational overhead due to accessing the HDF5 file more often
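For the parallel-access concern, one possible mitigation is to serialize writers with an inter-process file lock, since HDF5 itself has no built-in multi-writer support. A sketch under that assumption, with a hypothetical `append_score_locked` helper (not part of moabb):

```python
import h5py
from filelock import FileLock


def append_score_locked(path: str, key: str, score: float) -> None:
    # Hypothetical helper: serialize writes across processes with a sidecar
    # lock file, because HDF5 does not support concurrent writers.
    with FileLock(path + ".lock"):
        with h5py.File(path, "a") as f:
            if key not in f:
                f.create_dataset(key, shape=(0,), maxshape=(None,), dtype="f8")
            ds = f[key]
            n = ds.shape[0]
            ds.resize((n + 1,))
            ds[n] = score
```

For the overhead concern, results could also be buffered in memory and flushed every N folds or once per subject, rather than opening the file on every single fold.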
What do you think?
I love this idea, it is fundamental, but I don't have much experience with modifying the HDF5 handling. Can you open a PR so we can discuss it with the code?