kmeans_pytorch
kmeans using PyTorch
Your method helps me a lot, but how do I define the initial cluster centers myself, similar to the `init` parameter of the KMeans method in scikit-learn?
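The version of `kmeans()` discussed in these issues may not expose an `init`-style argument. As a workaround, here is a minimal plain-PyTorch sketch of Lloyd's iterations seeded with user-supplied centers; the helper `kmeans_with_init` is hypothetical and not part of the library.

```python
import torch

def kmeans_with_init(x, init_centers, num_iters=100, tol=1e-4):
    # Lloyd's algorithm starting from user-supplied centers (shape: (K, D)).
    centers = init_centers.clone()
    for _ in range(num_iters):
        # Assign each point to its nearest center.
        dists = torch.cdist(x, centers)            # (N, K)
        labels = dists.argmin(dim=1)               # (N,)
        # Recompute each center as the mean of its assigned points;
        # keep the old center if a cluster ends up empty.
        new_centers = torch.stack([
            x[labels == k].mean(dim=0) if (labels == k).any() else centers[k]
            for k in range(centers.shape[0])
        ])
        shift = (new_centers - centers).norm(dim=1).max()
        centers = new_centers
        if shift < tol:
            break
    return labels, centers

x = torch.randn(1000, 2)
init = x[:3].clone()                 # choose any 3 points as the initial centers
labels, centers = kmeans_with_init(x, init)
```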
Hi, I was executing the example and got the following warning:

```
[running kmeans]: 0it [00:00, ?it/s]
running k-means on cpu..
/pytorch/torch/csrc/utils/python_arg_parser.cpp:756: UserWarning: This overload of nonzero is deprecated:
    nonzero(Tensor...
```
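This warning is raised by PyTorch itself whenever `nonzero()` is called without the `as_tuple` keyword; it does not affect the clustering results. A small standalone illustration of the deprecated call and the explicit forms that avoid the warning:

```python
import torch

t = torch.tensor([0.0, 1.5, 0.0, 2.0])

# Deprecated overload: emits the UserWarning on older PyTorch builds.
idx_old = t.nonzero()

# Explicit forms: pass as_tuple, no warning.
idx = torch.nonzero(t, as_tuple=False)        # (N, 1) tensor of indices
idx_tuple = torch.nonzero(t, as_tuple=True)   # tuple of 1-D index tensors
```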
Does this method support clustering of batched data, i.e. input of shape (batch_size, n, dim)? That is, does it support the clustering of n sample points with dim...
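Assuming `kmeans()` takes a single 2-D `(n, dim)` tensor as in the README example, a `(batch_size, n, dim)` batch can be handled by clustering each slice separately; a sketch:

```python
import torch
from kmeans_pytorch import kmeans

batch = torch.randn(8, 1000, 16)   # (batch_size, n, dim)
num_clusters = 4

all_ids, all_centers = [], []
for x in batch:                    # x has shape (n, dim)
    ids, centers = kmeans(X=x, num_clusters=num_clusters,
                          distance='euclidean', device=torch.device('cpu'))
    all_ids.append(ids)
    all_centers.append(centers)

cluster_ids = torch.stack(all_ids)          # (batch_size, n)
cluster_centers = torch.stack(all_centers)  # (batch_size, num_clusters, dim)
```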
I want to choose the number of iterations, but when I pass `iter_limit=xxx` it gives an error saying that the parameter doesn't exist.
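Whether an iteration-limit keyword is available depends on the installed version. One way to check is to inspect the signature of the `kmeans` you actually imported; if no such parameter is listed, the installed copy simply does not support it:

```python
import inspect
from kmeans_pytorch import kmeans

# Prints the keyword arguments accepted by the installed version.
print(inspect.signature(kmeans))
```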
Hi! Thanks for the great tool! I noticed that the output of `kmeans()` is forced to be on the CPU regardless of what device was requested. Line 121 [here](https://github.com/subhadarship/kmeans_pytorch/blob/a65871651e9b38f89fa2bf0b02c0170bf40b52bf/kmeans_pytorch/__init__.py#L121) ```...
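Until this is changed upstream, a simple workaround is to move the returned tensors back onto the device you requested (using the `kmeans(X=..., num_clusters=..., distance=..., device=...)` call from the README):

```python
import torch
from kmeans_pytorch import kmeans

device = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu')
x = torch.randn(1000, 8)

cluster_ids, cluster_centers = kmeans(X=x, num_clusters=3,
                                      distance='euclidean', device=device)

# The library may return CPU tensors; move them back explicitly.
cluster_ids = cluster_ids.to(device)
cluster_centers = cluster_centers.to(device)
```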
Hi, can you explain how to find the optimal k for unsupervised learning, e.g. with the elbow method? Thanks
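One common approach is the elbow method: run k-means for a range of k, compute the inertia (sum of squared distances of each point to its assigned center), and pick the k where the curve bends. A sketch built on top of `kmeans()`; the inertia computation here is hand-rolled, not a library feature:

```python
import torch
import matplotlib.pyplot as plt
from kmeans_pytorch import kmeans

x = torch.randn(1000, 8)
ks = range(2, 11)
inertias = []

for k in ks:
    ids, centers = kmeans(X=x, num_clusters=k,
                          distance='euclidean', device=torch.device('cpu'))
    centers = centers.to(x.device)
    # Squared distance from each point to its assigned center.
    d2 = ((x - centers[ids]) ** 2).sum(dim=1)
    inertias.append(d2.sum().item())

plt.plot(list(ks), inertias, marker='o')
plt.xlabel('k')
plt.ylabel('inertia')
plt.show()
```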
The example.ipynb produces a different `center_embedding` on every run. How can I resolve this?
A simple example to reproduce this issue:

```python
import torch
import numpy as np
import matplotlib.pyplot as plt
from kmeans_pytorch import kmeans, kmeans_predict

np.random.seed(123)
data_size, dims, num_clusters = 1000, 200, 3
...
```
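If the run-to-run differences come from the random selection of initial centers, seeding all the global RNGs the library might draw from (Python, NumPy, PyTorch) before each run should make the output repeatable. A sketch under that assumption; the `seed_everything` helper is hypothetical:

```python
import random
import numpy as np
import torch

def seed_everything(seed=123):
    # Seed every RNG the clustering code might draw initial centers from.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    if torch.cuda.is_available():
        torch.cuda.manual_seed_all(seed)

seed_everything(123)
# ... build the data and call kmeans() here; rerunning from the same seed
# should now reproduce the same cluster centers.
```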
I'm trying to run k-means with 64 clusters on 50,000 data points of 512 dimensions, and the following error occurs:

```
running k-means on cuda..
[running kmeans]: 0it [00:00, ?it/s]tcmalloc: large...
```
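Assuming the allocation failure comes from a broadcasted `(N, K, dim)` intermediate in the pairwise-distance step, 50,000 × 64 × 512 float32 values is roughly 6.1 GiB, which would explain the tcmalloc message. Below is a sketch of that size check plus a chunked nearest-center assignment that avoids the large intermediate; `chunked_assign` is illustrative, not part of the library:

```python
import torch

N, K, D = 50_000, 64, 512
# ~6.1 GiB for a float32 (N, K, D) intermediate.
print(N * K * D * 4 / 1024**3, "GiB")

def chunked_assign(x, centers, chunk=4096):
    """Assign each row of x to its nearest center without an (N, K, D) tensor."""
    labels = torch.empty(x.shape[0], dtype=torch.long, device=x.device)
    for start in range(0, x.shape[0], chunk):
        block = x[start:start + chunk]
        labels[start:start + chunk] = torch.cdist(block, centers).argmin(dim=1)
    return labels

x = torch.randn(N, D)
centers = x[torch.randperm(N)[:K]]   # K randomly chosen points as centers
labels = chunked_assign(x, centers)
```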
How do I set `num_features`?
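Assuming `num_features` refers to the dimensionality of each sample (the `dims` variable in the README example), it is not a separate parameter: it is determined by the width of the input tensor, as in this sketch:

```python
import torch
from kmeans_pytorch import kmeans

num_samples, num_features = 1000, 32         # feature count = width of the input
x = torch.randn(num_samples, num_features)   # shape (num_samples, num_features)

cluster_ids, cluster_centers = kmeans(X=x, num_clusters=5,
                                      distance='euclidean', device=torch.device('cpu'))
print(cluster_centers.shape)                 # (5, num_features)
```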