
ValueError: Input contains NaN, infinity or a value too large for dtype('float32').

zji90 opened this issue on Oct 20, 2019 · 1 comment

Hi, I encountered the following error when running SCALE. I have checked my data matrix: the maximum value is 35, the minimum is 0, and there are no NA values. Just wondering how to fix the problem? Thanks!

```
Traceback (most recent call last):
  File "/home-4/[email protected]/scratch/software/scale/SCALE/SCALE.py", line 134, in
    pred = model.predict(testloader, device)
  File "/scratch/users/[email protected]/software/scale/SCALE/scale/model.py", line 97, in predict
    pred = kmeans.fit_predict(feature)
  File "/home-4/[email protected]/.local/lib/python3.7/site-packages/sklearn/cluster/k_means_.py", line 998, in fit_predict
    return self.fit(X, sample_weight=sample_weight).labels_
  File "/home-4/[email protected]/.local/lib/python3.7/site-packages/sklearn/cluster/k_means_.py", line 972, in fit
    return_n_iter=True)
  File "/home-4/[email protected]/.local/lib/python3.7/site-packages/sklearn/cluster/k_means_.py", line 312, in k_means
    order=order, copy=copy_x)
  File "/home-4/[email protected]/.local/lib/python3.7/site-packages/sklearn/utils/validation.py", line 542, in check_array
    allow_nan=force_all_finite == 'allow-nan')
  File "/home-4/[email protected]/.local/lib/python3.7/site-packages/sklearn/utils/validation.py", line 56, in _assert_all_finite
    raise ValueError(msg_err.format(type_err, X.dtype))
ValueError: Input contains NaN, infinity or a value too large for dtype('float32').
```
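Note that the traceback fails inside `kmeans.fit_predict(feature)`, i.e. on the latent features produced by the model, not on the raw input matrix. A minimal sketch (NumPy only; `feature` and the helper name are hypothetical, not part of SCALE) for locating the non-finite values before clustering:

```python
import numpy as np

def report_nonfinite(feature):
    """Count NaN/Inf entries and return the rows (cells) that contain them."""
    feature = np.asarray(feature, dtype=np.float32)
    bad_mask = ~np.isfinite(feature)              # True where NaN or +/-Inf
    bad_rows = np.where(bad_mask.any(axis=1))[0]  # offending samples
    print(f"non-finite entries: {bad_mask.sum()}, affected rows: {len(bad_rows)}")
    return bad_rows

# Example: a 3x2 feature matrix where one cell's embedding blew up
feature = np.array([[0.1, 0.2], [np.nan, np.inf], [0.3, 0.4]])
report_nonfinite(feature)  # -> row 1 is the offending sample
```

If the bad rows cluster on a handful of cells, that points to the outlier-sample explanation below rather than a problem with the input data.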

— zji90, Oct 20 '19

See the note on the README page. If you come across the NaN loss:

- try another random seed
- filter peaks with a harsher threshold, e.g. `-x 0.04` or `0.06`
- filter low-quality cells, e.g. `--min_peaks 400` or `600`
- change the initial learning rate, e.g. `--lr 0.0002`

This error is not caused by your input data containing NaN values. It comes from exploding gradients triggered by some outlier samples during training; filtering out low-quality cells or rare peaks usually solves the issue.
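The suggested filtering can be checked outside SCALE, e.g. to see how many cells and peaks the thresholds would remove from a cells × peaks count matrix. A hedged sketch with NumPy (the function name is hypothetical; the default thresholds mirror the README flags `--min_peaks` and `-x` mentioned above):

```python
import numpy as np

def filter_counts(X, min_peaks=600, min_cell_frac=0.06):
    """Drop low-quality cells (too few detected peaks) and rare peaks
    (present in too small a fraction of cells)."""
    X = np.asarray(X)
    cell_keep = (X > 0).sum(axis=1) >= min_peaks        # cells with enough detected peaks
    X = X[cell_keep]
    peak_keep = (X > 0).mean(axis=0) >= min_cell_frac   # peaks seen in enough cells
    return X[:, peak_keep], cell_keep, peak_keep
```

After filtering, rerun SCALE on the reduced matrix; if the loss still becomes NaN, a different random seed or a lower `--lr` is the remaining advice above.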

— jsxlei, Oct 20 '19