ENH: decoding module 2025
Here I'll summarise my work for GSoC 2025 and describe a few ideas for future improvements that came up along the way.
My main goal (following Mike X Cohen's paper) was to implement an sklearn transformer for generalized eigendecomposition (GED) that would generalize algorithms like CSP, xDAWN, etc.
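The core operation behind all of these algorithms can be sketched in a few lines of NumPy/SciPy. This is a plain illustration of GED itself (toy data, not the `_GEDTransformer` implementation): given two covariance matrices `S` and `R`, the generalized eigenvectors of `(S, R)` are spatial filters that maximise the power ratio between the two conditions.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n_ch, n_times = 8, 1000

# Toy multichannel data: a sinusoidal "source" planted in channel 0 of the
# signal condition, pure noise in the reference condition.
X_signal = rng.standard_normal((n_ch, n_times))
X_signal[0] += 3 * np.sin(np.linspace(0, 40 * np.pi, n_times))
X_ref = rng.standard_normal((n_ch, n_times))

S = np.cov(X_signal)  # covariance to maximise
R = np.cov(X_ref)     # covariance to normalise by

# Generalized eigendecomposition S w = lambda R w: eigenvectors are spatial
# filters; a larger eigenvalue means a larger S-to-R power ratio.
evals, evecs = eigh(S, R)
order = np.argsort(evals)[::-1]           # sort by descending eigenvalue
evals, filters = evals[order], evecs[:, order]
sources = filters.T @ X_signal            # spatially filtered time courses
```

CSP, xDAWN, and friends differ mainly in how `S` and `R` are estimated (and in eigenvalue sorting), which is exactly what a GED transformer can abstract over.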
The transformer was implemented in #13259, with support for restriction/whitening of rank-deficient covariances (see the implementation details entry).
GED-based algorithms produce spatial filters that essentially separate sources. The `SpatialFilter` container for visualising them (and `LinearModel`) was implemented in #13332 and currently supports scree plots and topomaps for filters and patterns (see, for example, the xDAWN example).
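Plotting both filters and patterns matters because they answer different questions: filters are the backward model, patterns the forward model. A plain-NumPy sketch of the standard conversion from Haufe et al. (2014), not MNE's internal code:

```python
import numpy as np

rng = np.random.default_rng(1)
n_ch, n_times = 8, 2000
X = rng.standard_normal((n_ch, n_times))
cov = np.cov(X)

# W: spatial filters in columns (random here, purely for illustration).
W = rng.standard_normal((n_ch, n_ch))
sources = W.T @ X

# Haufe et al. (2014): patterns A = Cov(X) @ W @ inv(Cov(sources)),
# i.e. the forward model implied by the backward model W.
# (Sanity check: for a square, invertible W this reduces to pinv(W).T.)
A = cov @ W @ np.linalg.inv(np.cov(sources))
```

Topomaps of the columns of `A` (patterns) are interpretable as source projections on the sensors, while the columns of `W` (filters) show what gets multiplied with the data.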
- [ ] It would be useful to have a tutorial showing how to implement custom covariance estimation / eigenvalue sorting functions for `_GEDTransformer` and investigate the resulting sources with `SpatialFilter`
- [ ] After that `_GEDTransformer` could be made public
- [ ] `_GEDTransformer` could have an `inverse_transform` similar to `CSP`'s, but generalized to the "multi" decomposition as in the `XdawnTransformer` case
- [ ] Following ICA's visualisation of spatially filtered time series, it would be nice to have a similar function for other spatial filters, but it will require adding a new branch for `SpatialFilter` (or unifying it with `ICA`) in `mne.viz._figure.BrowserBase`
- [ ] `SpatialFilter` is intended for visualising multiple spatial filters fixed over time, but there are cases such as EMS, `LinearModel` on vectorized data, and `SlidingEstimator` wrapping `LinearModel` (and potentially `GeneralizingEstimator`) where each time point of an epoch can have a different pattern. These can be conveniently visualised using `EvokedArray` and could be implemented either as a second use case for `SpatialFilter` or in another container inheriting from `EvokedArray`, for example
- [ ] `mne.preprocessing.Xdawn` works with `Epochs` and so can't directly inherit from `_GEDTransformer`, but perhaps `_GEDTransformer`'s logic in `fit` and `transform` could be modularised and then reused in `Xdawn`
- [ ] `SlidingEstimator` and `GeneralizingEstimator` currently apply the wrapped classifier per time point. This could be generalized to sliding windows, where the search lights pass windows to the downstream pipeline, covering cases like `SlidingEstimator(make_pipeline(Vectorizer(), SVC()))` or, using pyRiemann transformers, `SlidingEstimator(make_pipeline(XdawnTransformer(), Covariances(), TangentSpace(), SVC()))` for ERP decoding
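The sliding-window idea from the last item can be illustrated with a plain NumPy/sklearn loop (toy shapes and a planted class effect, not MNE's `SlidingEstimator` API): instead of handing the pipeline one time point `(n_epochs, n_channels)`, each step hands it a flattened window.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)
n_epochs, n_ch, n_times, win = 60, 4, 20, 5
X = rng.standard_normal((n_epochs, n_ch, n_times))
y = rng.integers(0, 2, n_epochs)
X[y == 1, :, 12:17] += 1.0  # plant a class effect in a late time window

scores = []
for start in range(n_times - win + 1):
    # Each window (n_epochs, n_ch, win) is flattened -- the "Vectorizer"
    # step -- and passed to the downstream pipeline instead of one time point.
    Xw = X[:, :, start:start + win].reshape(n_epochs, -1)
    clf = make_pipeline(StandardScaler(), SVC())
    scores.append(clf.fit(Xw[:40], y[:40]).score(Xw[40:], y[40:]))
```

Windows overlapping the planted effect should score above chance while the early ones hover around it, which is exactly the temporal profile a windowed search light would expose.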
The second part of the GSoC was to make the decoding classes compliant with the new sklearn (1.6+) estimator checks and data validation. `LinearModel` was reworked into a meta-estimator in #13361, while sklearn compliance for the other classes was corrected in #13393.
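Those checks can be run directly; a minimal sketch using a built-in sklearn estimator (swap in a decoding class to reproduce the compliance testing; `parametrize_with_checks` is the pytest-friendly variant):

```python
from sklearn.preprocessing import StandardScaler
from sklearn.utils.estimator_checks import check_estimator

# Runs sklearn's estimator API test suite (fit/transform contracts, input
# validation, n_features_in_, tags, ...) and raises on any failing check.
check_estimator(StandardScaler())
passed = True  # reached only if every check succeeded
```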
Thanks for the summary and the great work @Genuster !