gsplat
PngCompression issue
I hit the following error when running PngCompression on my custom data:
  File "/home/zikai/Desktop/ParamGS/process_data/compress.py", line 71, in <module>
    compression_method.compress(save_path, splats)
  File "/home/zikai/anaconda3/envs/paramgs/lib/python3.10/site-packages/gsplat/compression/png_compression.py", line 106, in compress
    meta[param_name] = compress_fn(
  File "/home/zikai/anaconda3/envs/paramgs/lib/python3.10/site-packages/gsplat/compression/png_compression.py", line 363, in _compress_kmeans
    labels = kmeans.fit(x)
  File "/home/zikai/anaconda3/envs/paramgs/lib/python3.10/site-packages/torchpq/clustering/KMeans.py", line 415, in fit
    centroids = self.initialize_centroids(data)
  File "/home/zikai/anaconda3/envs/paramgs/lib/python3.10/site-packages/torchpq/clustering/KMeans.py", line 272, in initialize_centroids
    random_index = np.random.choice(
  File "numpy/random/mtrand.pyx", line 1001, in numpy.random.mtrand.RandomState.choice
ValueError: Cannot take a larger sample than population when 'replace=False'
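For context, the error is straightforward to reproduce outside gsplat: sampling initial centroid indices without replacement requires the cluster count to be at most the number of points. A minimal sketch (the 65,536 cluster count below is my assumption about the hardcoded value; the exact number in the library may differ):

```python
import numpy as np

n_points = 50_000     # roughly my number of Gaussians
n_clusters = 65_536   # assumed hardcoded cluster count

# Sampling centroid indices without replacement needs
# n_clusters <= n_points; otherwise NumPy raises exactly the
# ValueError shown in the traceback above.
try:
    np.random.choice(n_points, size=n_clusters, replace=False)
except ValueError as err:
    print(err)
```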
In my case I have only 50,000 Gaussians (12 MB), and the error occurs when compressing shN, where k-means is applied. I noticed that n_clusters is hardcoded here; is there any way to pass n_clusters in? I hardcoded it to 100 in the library, and then hit another problem:
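Since the cluster count is not exposed, the workaround I would expect to be safe is clamping it to the number of points before fitting. A sketch with a hypothetical helper name (`safe_n_clusters` is not part of gsplat, and the 65,536 default is my assumption):

```python
def safe_n_clusters(n_points: int, requested: int = 65_536) -> int:
    """Clamp the requested cluster count so that centroid
    initialization can sample indices without replacement."""
    return min(requested, n_points)

# Small scene: fall back to one cluster per point at most.
print(safe_n_clusters(50_000))     # 50000
# Large scene: keep the requested count.
print(safe_n_clusters(1_000_000))  # 65536
```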
Traceback (most recent call last):
  File "/home/zikai/Desktop/ParamGS/process_data/compress.py", line 71, in <module>
    compression_method.compress(save_path, splats)
  File "/home/zikai/anaconda3/envs/paramgs/lib/python3.10/site-packages/gsplat/compression/png_compression.py", line 106, in compress
    meta[param_name] = compress_fn(
  File "/home/zikai/anaconda3/envs/paramgs/lib/python3.10/site-packages/gsplat/compression/png_compression.py", line 363, in _compress_kmeans
    labels = kmeans.fit(x)
  File "/home/zikai/anaconda3/envs/paramgs/lib/python3.10/site-packages/torchpq/clustering/KMeans.py", line 418, in fit
    maxsims, labels = self.get_labels(data, centroids) #top1 search
  File "/home/zikai/anaconda3/envs/paramgs/lib/python3.10/site-packages/torchpq/clustering/KMeans.py", line 332, in get_labels
    if required < remaining:
UnboundLocalError: local variable 'required' referenced before assignment
The same problem also occurs when I compress a public dataset with many more Gaussians (127 MB). Thank you in advance.
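I have not dug into torchpq's get_labels, but an UnboundLocalError like this usually means a variable is assigned only inside a loop or branch that never executed. A minimal Python reproduction of that pattern (the function and its arguments are hypothetical, not torchpq's actual code):

```python
def pick_chunk_size(candidates, remaining):
    # 'required' is bound only inside the loop body. If 'candidates'
    # is empty, the loop never runs and the final reference raises
    # UnboundLocalError -- the same failure mode as the traceback.
    for size in candidates:
        required = size * 4
        if required < remaining:
            return size
    return required

try:
    pick_chunk_size([], remaining=1024)
except UnboundLocalError as err:
    print(type(err).__name__)  # UnboundLocalError
```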