torch_kmeans
A question about soft-kmeans
Hello, and thank you for providing this toolkit. I ran into a question while using it:
import torch
from torch_kmeans import SoftKMeans

# three 3-dimensional points; the second and third are almost identical,
# while the first is far from both
z = torch.tensor([[1.0, 1.0, 1.0], [2.0, 2.0, 2.0], [2.0, 2.1, 2.0]])
z = z.unsqueeze(0)  # add a batch dimension -> shape (1, 3, 3)

skmeans = SoftKMeans(n_clusters=2, n_init=20, temp=5, max_iter=1000, tol=1e-8, init_method='k-means++')
result = skmeans(z)
In this simple example, why are the first and third samples assigned to the same cluster? With n_clusters=2, that leaves the second sample in a cluster by itself, even though it is by far the closest point to the third. Is this behaviour inherent to the soft k-means algorithm? For reference, a quick distance check is included below.
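Here is the sanity check I ran (a minimal sketch using only torch.cdist, independent of torch_kmeans), which shows the pairwise Euclidean distances between the three points and is why the grouping surprised me:

import torch

z = torch.tensor([[1.0, 1.0, 1.0], [2.0, 2.0, 2.0], [2.0, 2.1, 2.0]])

# pairwise Euclidean distances between the three points
print(torch.cdist(z, z))
# tensor([[0.0000, 1.7321, 1.7916],
#         [1.7321, 0.0000, 0.1000],
#         [1.7916, 0.1000, 0.0000]])

# the second and third points are only 0.1 apart, while the first point is
# ~1.73 away from both, so I expected samples 2 and 3 to share a cluster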