TimotheeMathieu
For the Euclidean distance, we observe the same phenomenon:
```
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics.pairwise import euclidean_distances

d = 100
N = 100
np.random.seed(42)
X...
```
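The snippet above is cut off; here is a minimal sketch of what it presumably computes, assuming the point is to locate the medoid of a single Gaussian cluster (the plotting code and the exact quantity displayed are assumptions on my part):
```
# Hedged completion of the truncated snippet above: plotting is replaced by
# simple prints, and the quantity of interest is assumed to be where the
# medoid of a single Gaussian cluster falls.
import numpy as np
from sklearn.metrics.pairwise import euclidean_distances

d = 100
N = 100
np.random.seed(42)
X = np.random.randn(N, d)            # one Gaussian cluster centered at the origin

D = euclidean_distances(X)           # pairwise Euclidean distances
medoid = X[D.sum(axis=1).argmin()]   # point minimizing total distance to the others

# Compare the medoid's distance to the true center with that of a typical point.
print("||medoid||   =", np.linalg.norm(medoid))
print("mean ||x_i|| =", np.linalg.norm(X, axis=1).mean())
```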
Thanks for catching the bug. Here is the result with the bug corrected; it is still not in the middle of the cluster:
Yes, that was exactly my point, thank you.
Thanks for the comments. @lorentzenchr, what I did is not the Huber loss: it is a robust estimator of the mean applied to the squared errors. I used the MSE...
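To make the distinction concrete, here is a minimal sketch (not the code of this PR; median-of-means is only one possible robust estimator of the mean, the data are made up): the robustness acts on how the per-sample squared errors are aggregated, whereas the Huber loss changes the per-sample loss itself.
```
import numpy as np

rng = np.random.RandomState(0)
y_true = rng.randn(100)
y_pred = y_true + 0.1 * rng.randn(100)
y_true[:5] += 20                      # a few gross outliers in the targets

sq_errors = (y_true - y_pred) ** 2    # per-sample squared errors

# Plain MSE: the outliers dominate the average.
mse = sq_errors.mean()

# Robust estimator of the mean applied to the squared errors
# (median-of-means here, as one possible choice).
def median_of_means(x, n_blocks=10, rng=rng):
    blocks = np.array_split(rng.permutation(x), n_blocks)
    return np.median([b.mean() for b in blocks])

print("MSE:", mse, "robust MSE:", median_of_means(sq_errors))
```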
The IRLS algorithm is a bit implicit, I agree. The principle is that we do least squares at each iteration (cf. line 392) and this least squares is reweighted using...
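For reference, a generic IRLS sketch (not the PR's implementation; the Huber-type weights and the MAD scale estimate are assumptions here): each iteration solves a weighted least-squares problem, then recomputes the weights from the residuals.
```
import numpy as np

def irls(X, y, c=1.345, n_iter=20):
    """Generic IRLS for robust linear regression with Huber-type weights."""
    n, p = X.shape
    w = np.ones(n)
    beta = np.zeros(p)
    for _ in range(n_iter):
        # Weighted least-squares step: solve (X^T W X) beta = X^T W y.
        Xw = X * w[:, None]
        beta = np.linalg.solve(Xw.T @ X, Xw.T @ y)
        # Recompute the weights from the residuals (scale estimated by the MAD).
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12
        w = np.minimum(1.0, c / (np.abs(r) / s + 1e-12))
    return beta

# Usage on toy data with a few corrupted targets.
rng = np.random.RandomState(0)
X = rng.randn(200, 3)
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.randn(200)
y[:10] += 50
print(irls(X, y))
```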
> I see your point. I just find calling this option `solver` a bit confusing, as far as I understand that would mix two different things between the inner solver...
Hello, thank you, this looks interesting. I think, given the subject, you can try to understand the outline of the KMedoids code and do the same; the idea is...
KMedoids can be better than KMeans, for instance for robustness purposes. For example, see this [figure](https://wtf.roflcopter.fr/pics/r8xcydMI/dhVnf3TO.png) where KMedoids gives a really good result while KMeans detects any outlier as belonging...
Yes, in fact it is an example I came up with for the PR #42; you can find it [here](https://scikit-learn-extra.readthedocs.io/en/latest/auto_examples/plot_robust_kmeans.html#sphx-glr-auto-examples-plot-robust-kmeans-py). I just added KMedoids with default parameters and I got the...
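This is not the exact example behind the linked figure, but a minimal sketch of the kind of comparison (the data generation and parameters are my own assumptions):
```
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from sklearn_extra.cluster import KMedoids

# Three well-separated blobs plus some scattered far-away outliers.
rng = np.random.RandomState(0)
X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
outliers = rng.uniform(-40, 40, size=(20, 2))
X = np.vstack([X, outliers])

# Fit both clusterers on the contaminated data and compare their centers.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
kmed = KMedoids(n_clusters=3, random_state=0).fit(X)

print("KMeans centers:\n", km.cluster_centers_)
print("KMedoids centers:\n", kmed.cluster_centers_)
```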
I don't think "[Cos, Sin] matrix is a matrix of real and imaginary components of Fourier transform." is true; if that were the case, we could have used fft and...
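If I understand the context correctly (this is an assumption on my part), the [cos, sin] features come from random Fourier features, where the projection matrix is drawn at random from the kernel's spectral distribution rather than being the deterministic DFT matrix. A minimal sketch for the RBF kernel:
```
# Hedged sketch of random Fourier features for the RBF kernel: the [cos, sin]
# features are built from a *random* Gaussian projection W (Bochner's theorem),
# not from the deterministic DFT matrix, which is why an fft cannot replace it.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.RandomState(0)
X = rng.randn(50, 5)
gamma, D = 1.0, 2000                       # kernel width and number of random features

W = np.sqrt(2 * gamma) * rng.randn(5, D)   # random frequencies, not DFT frequencies
Z = np.hstack([np.cos(X @ W), np.sin(X @ W)]) / np.sqrt(D)

# Z @ Z.T approximates the RBF kernel matrix; print the largest deviation.
print(np.abs(Z @ Z.T - rbf_kernel(X, gamma=gamma)).max())
```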