imbalanced-learn
Cannot find reference for the one vs. rest scheme used to extend many algorithms for the multi-class case
In the docs, the following statement appears frequently alongside the references:

"Supports multi-class resampling. A one-vs.-rest scheme is used when sampling a class as proposed in [1]."
So far, every referenced paper I have read describes neither a one-vs.-rest scheme nor any extension to the multi-class case. Take, for instance, TomekLinks, CondensedNearestNeighbour, or EditedNearestNeighbours.
I clearly understand how one-vs.-rest works for classification models, but I am not sure how it is used to extend binary oversampling or undersampling methods to the multi-class case in imbalanced-learn. Assuming the multi-class extensions are indeed not described in the papers, it would be nice if they were explained in the docs.
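For concreteness, here is my current guess at what such a one-vs.-rest scheme might look like. This is a pure-Python sketch, not imbalanced-learn's actual code: `binary_enn_like_clean` is a hypothetical stand-in for any binary cleaning method (ENN-style), and `one_vs_rest_clean` shows how one could relabel the data as "class c vs. rest" once per class and combine the survivors.

```python
def binary_enn_like_clean(X, y):
    """Toy stand-in for a binary cleaning method (ENN-style).

    Drops a positive sample (label 1) when its nearest neighbour,
    measured by 1-D distance here, is negative (label 0).
    Purely illustrative; not the actual library implementation.
    Returns the indices of samples that are kept.
    """
    keep = []
    for i, (xi, yi) in enumerate(zip(X, y)):
        if yi == 0:
            # Negatives ("rest") are never removed in this pass.
            keep.append(i)
            continue
        # Nearest neighbour among all other points.
        j = min((k for k in range(len(X)) if k != i),
                key=lambda k: abs(X[k] - xi))
        if y[j] == 1:
            keep.append(i)
    return keep


def one_vs_rest_clean(X, y):
    """My guess at the one-vs.-rest extension to multi-class.

    For each class c: relabel the data as (c = 1, rest = 0), run the
    binary cleaner on that binary problem, and keep only the class-c
    samples that survive. The final dataset is the union of survivors
    over all classes.
    """
    survivors = set()
    for c in set(y):
        y_bin = [1 if label == c else 0 for label in y]
        kept = binary_enn_like_clean(X, y_bin)
        survivors.update(i for i in kept if y[i] == c)
    keep = sorted(survivors)
    return [X[i] for i in keep], [y[i] for i in keep]


# Usage: three 1-D clusters, with one stray class-"b" point (0.3)
# sitting inside the class-"a" cluster.
X = [0.0, 0.01, 0.02, 5.0, 5.1, 0.3, 10.0, 10.1]
y = ["a", "a", "a", "b", "b", "b", "c", "c"]
X_res, y_res = one_vs_rest_clean(X, y)
# The stray class-"b" point at 0.3 is removed; all other points stay.
```

If something like this is what the library does internally, a short paragraph in the docs describing it would resolve the confusion.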