CJ Carey
Looks like either `Lx` or `impostors` was empty when computing the gradient of the loss. Is `X_train` a numpy array or a Pandas DataFrame in your call to `grid_lmnn_knn.fit(X_train,y_train)`? If it's...
I reproduced the issue locally, and it turns out that `impostors` is indeed empty when computing the gradient. See similar issue gh-17 which apparently didn't result in a fix for...
Here's a workaround. It just bails out entirely if no impostors can be found: https://github.com/scikit-learn-contrib/metric-learn/commit/612fcc4c74991dd377bc7aa9ea1741b9e8bc4f14 Not super elegant, but it should work okay.
Sorry, try `if len(impostors) == 0:` instead. On Sun, Apr 4, 2021 at 2:31 PM Angelo Cortez ***@***.***> wrote: > Tried swapping it to the following, but it now goes...
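For context, the suggested guard could look roughly like this (a minimal sketch only; the function and variable names other than `impostors` are illustrative, not metric-learn's actual internals):

```python
import numpy as np

def lmnn_gradient_step(Lx, impostors, grad):
    """Illustrative sketch of the workaround: bail out of the
    impostor term of the LMNN gradient when no impostors exist.
    (Names here are hypothetical, not metric-learn's API.)"""
    if len(impostors) == 0:
        # No impostors found: the push-loss contribution vanishes,
        # so return the gradient unchanged instead of indexing
        # into an empty array.
        return grad
    # ... otherwise accumulate the impostor contributions as usual ...
    return grad
```

The key point is that `len(impostors) == 0` works for both lists and numpy arrays, whereas truthiness checks on arrays raise an error.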
Were you able to try the code from gh-309? I'm curious to see how it would handle this case.
I resolved merge conflicts, but there are still a bunch of review items to address here.
I like the idea of allowing an incremental fit, but we should be careful about how the API should look. Perhaps we can follow the scikit-learn convention and add a...
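The scikit-learn convention mentioned above could be sketched like this (a hypothetical class for illustration only, not metric-learn code; scikit-learn estimators expose incremental fitting via a `partial_fit` method):

```python
import numpy as np

class IncrementalMetricLearner:
    """Hypothetical sketch of the scikit-learn incremental-fit
    convention: a partial_fit method that initializes state on the
    first call and updates it in place on subsequent calls."""

    def __init__(self, n_components=2):
        self.n_components = n_components

    def partial_fit(self, X, y):
        X = np.asarray(X)
        if not hasattr(self, "components_"):
            # First call: initialize the transformation from the data shape.
            self.components_ = np.eye(X.shape[1])[: self.n_components]
        # ... perform one incremental update of components_ using (X, y) ...
        return self
```

Following this convention would let users grow the model with new batches without refitting from scratch, and keeps the API familiar to scikit-learn users.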
I haven't considered keeping track of per-iteration objective function values, though I see why it might be useful in some cases. My main concern is that this might negatively impact...
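One way to avoid paying that cost unconditionally would be an opt-in flag, along these lines (a generic gradient-descent sketch with made-up names, not the metric-learn solver):

```python
def fit_with_history(objective, x0, n_iter=10, lr=0.1, record=False):
    """Hypothetical sketch: record per-iteration objective values
    only when the caller asks for them, so the default code path
    stays unaffected."""
    history = [] if record else None
    x = x0
    for _ in range(n_iter):
        value, grad = objective(x)
        if record:
            history.append(value)
        x = x - lr * grad  # plain gradient-descent update
    return x, history
```

With `record=False` (the default) no list is built, so the overhead only exists for users who explicitly want the objective trace.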
Note that we also have gh-13 tracking other requested algorithms. Let's keep that list updated as new algorithms are proposed/implemented. I'm in favor of adding more algorithm diversity to the...
This change looks fine to me, though I'm not sure when this warm-start option is useful in practice. Sorry for the extreme delay in reviewing! @grudloff want to take a...