
A question about soft-kmeans

Open wayc04 opened this issue 7 months ago • 0 comments

Hello, thank you for providing the toolkit. I have a question while using it:

import torch
from torch_kmeans import SoftKMeans

skmeans = SoftKMeans(n_clusters=2, n_init=20, temp=5, max_iter=1000, tol=1e-8, init_method='k-means++')
z = torch.tensor([[1.0, 1.0, 1.0], [2.0, 2.0, 2.0], [2.0, 2.1, 2.0]])
z = z.unsqueeze(0)  # add batch dimension: (1, 3, 3)
result = skmeans(z)

For this simple example, why are the first and third samples grouped into the same cluster? Is this due to the nature of the algorithm?
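For context, here is my understanding of the soft assignment step as a generic soft k-means sketch (this is an assumption on my part, not necessarily the exact formula `SoftKMeans` uses internally, and the centers below are made up for illustration): each point receives a softmax weight over clusters based on its negative squared distance to each center, scaled by the temperature.

```python
import torch

def soft_assign(x, centers, temp):
    """Soft k-means assignment sketch (hypothetical, for illustration).
    x: (n, d) points, centers: (k, d) cluster centers."""
    d2 = torch.cdist(x, centers) ** 2          # squared distances, shape (n, k)
    return torch.softmax(-temp * d2, dim=-1)   # soft assignment weights, shape (n, k)

x = torch.tensor([[1.0, 1.0, 1.0], [2.0, 2.0, 2.0], [2.0, 2.1, 2.0]])
# Hypothetical centers: one at the first point, one between the other two.
centers = torch.tensor([[1.0, 1.0, 1.0], [2.0, 2.05, 2.0]])
w = soft_assign(x, centers, temp=5.0)
print(w.argmax(dim=-1))  # hard labels recovered via argmax
```

With centers like these, the soft weights put the first point almost entirely in the first cluster and the other two in the second, so I would expect labels `[0, 1, 1]` rather than the grouping I observed.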

wayc04 avatar Mar 19 '25 16:03 wayc04