
n_neighbors inconsistency

Open MattEding opened this issue 5 years ago • 5 comments

Description

All the following classes use n_neighbors:

  • ADASYN
  • OneSidedSelection
  • NeighbourhoodCleaningRule
  • NearMiss
  • AllKNN
  • RepeatedEditedNearestNeighbours
  • EditedNearestNeighbours
  • CondensedNearestNeighbour

k_neighbors, on the other hand, is used by SMOTE and all of its variants.

This poses a problem with duck-typing and pipelines.

from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

from imblearn.pipeline import Pipeline
from imblearn.over_sampling import ADASYN
from imblearn.over_sampling import SMOTE

X, y = ...

smote = SMOTE()
adasyn = ADASYN()
logreg = LogisticRegression()

smote_pipe = Pipeline([('sampler', smote), ('classifier', logreg)])
adasyn_pipe = Pipeline([('sampler', adasyn), ('classifier', logreg)])

params = dict(sampler__n_neighbors=range(3, 6))
smote_grid = GridSearchCV(smote_pipe, params)
adasyn_grid = GridSearchCV(adasyn_pipe, params)

# fails due to k_neighbors instead of n_neighbors
# I am forced to make a new params dict
smote_grid.fit(X, y)

# succeeds
adasyn_grid.fit(X, y)

Expected Results

SMOTE would benefit from using n_neighbors so that the API is consistent.

Versions

Darwin-18.7.0-x86_64-i386-64bit
Python 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 14:38:56) [Clang 4.0.1 (tags/RELEASE_401/final)]
NumPy 1.17.1
SciPy 1.3.1
Scikit-Learn 0.21.3
Imbalanced-Learn 0.5.0

MattEding avatar Sep 11 '19 19:09 MattEding

I see, it could make sense. It would take two releases for the deprecation cycle. However, you still have some other neighbors parameters in the SMOTE variants as well, which could also be an issue.
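A minimal sketch of what such a rename-with-warning step could look like over the deprecation cycle (the class and parameter handling below are illustrative only, not imbalanced-learn's actual code):

import warnings


class SamplerWithRename:
    """Illustrative sketch of the usual rename-with-FutureWarning pattern."""

    def __init__(self, n_neighbors=5, k_neighbors="deprecated"):
        self.n_neighbors = n_neighbors
        self.k_neighbors = k_neighbors

    def _check_n_neighbors(self):
        # Honour the old name for two releases, warn, then drop it.
        if self.k_neighbors != "deprecated":
            warnings.warn(
                "'k_neighbors' is deprecated; use 'n_neighbors' instead.",
                FutureWarning,
            )
            return self.k_neighbors
        return self.n_neighbors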

You could always create your grid on the fly:

for pipe in [smote_pipe, adasyn_pipe]:
    # collect every parameter name containing 'neighbors' for this pipeline
    neighbors_param_names = [p for p in pipe.get_params().keys() if 'neighbors' in p]
    params = {p: range(3, 6) for p in neighbors_param_names}
    gs_pipe = GridSearchCV(pipe, params)
    gs_pipe.fit(X, y)

glemaitre avatar Sep 18 '19 17:09 glemaitre

I would argue that the extra m_neighbors parameter in SVMSMOTE and BorderlineSMOTE has a different meaning from the n/k_neighbors found in the other algorithms (and in those two themselves). The n/k_neighbors parameters are used only for finding neighbors, whereas m_neighbors appears to be used for flagging samples as 'danger' or 'noise'.
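For reference, both names sit side by side on the variants; the comments reflect the reading above, and the values are just the defaults:

from imblearn.over_sampling import BorderlineSMOTE, SVMSMOTE

# k_neighbors: neighbors used to build the synthetic samples
# m_neighbors: neighbors used to decide whether a sample is 'danger' or 'noise'
borderline = BorderlineSMOTE(k_neighbors=5, m_neighbors=10)
svm_smote = SVMSMOTE(k_neighbors=5, m_neighbors=10)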

I know this is a minor issue that has simple workarounds, but I felt that it was worth marking as an issue nonetheless.

MattEding avatar Sep 19 '19 16:09 MattEding

We could think about modifying this in 1.X, since we will have more freedom to break the API then.

glemaitre avatar Nov 17 '19 11:11 glemaitre

Additionally, I recently noticed that the inconsistency also occurs with the fitted attributes: self.nn_ vs. self.nn_k_ for non-SMOTE and SMOTE samplers, respectively.
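A quick way to see this, assuming toy data and the attribute names mentioned above (both are set once fit_resample has run):

from sklearn.datasets import make_classification
from imblearn.over_sampling import SMOTE
from imblearn.under_sampling import EditedNearestNeighbours

X, y = make_classification(weights=[0.9, 0.1], random_state=0)

smote = SMOTE()
smote.fit_resample(X, y)
print(smote.nn_k_)  # fitted NearestNeighbors stored as nn_k_ on SMOTE

enn = EditedNearestNeighbours()
enn.fit_resample(X, y)
print(enn.nn_)      # fitted NearestNeighbors stored as nn_ on the others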

MattEding avatar Nov 19 '19 02:11 MattEding

Hey! I came here from #680.

Thanks for your answer.

I know it's more or less complex and needs some time for the deprecation cycle (waiting for two releases), but is this going to start?

Thanks

rola93 avatar Feb 03 '20 13:02 rola93