Pierre Yger
This is strange, I'll have a look. But note that method='locally_exclusive' with radius_um=0 is equivalent to "by_channel", but much faster (it uses a numba kernel instead of numpy).
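A minimal sketch of that equivalence, assuming the detect_peaks API from spikeinterface.sortingcomponents.peak_detection (parameter names as in recent SpikeInterface versions; thresholds are arbitrary):

```python
from spikeinterface.sortingcomponents.peak_detection import detect_peaks

# 'by_channel': threshold crossings are evaluated independently per channel
peaks_bc = detect_peaks(recording, method="by_channel",
                        detect_threshold=5, peak_sign="neg")

# 'locally_exclusive' with radius_um=0: the spatial exclusion zone is empty,
# so it reduces to per-channel detection, but runs through the numba kernel
peaks_le = detect_peaks(recording, method="locally_exclusive",
                        detect_threshold=5, peak_sign="neg", radius_um=0)
```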
I cannot reproduce the problem with main. Note that to visualize peaks, once you get them, you can turn them into spikes via NumpySorting.from_peaks(peaks, recording.get_sampling_frequency(), unit_ids=recording.channel_ids). And then, once...
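For example, a small sketch putting those steps together (assuming an existing recording object; the plotting call is illustrative):

```python
import spikeinterface.widgets as sw
from spikeinterface.core import NumpySorting
from spikeinterface.sortingcomponents.peak_detection import detect_peaks

peaks = detect_peaks(recording, method="locally_exclusive", detect_threshold=5)

# one "unit" per channel, so the raster shows where peaks were detected
sorting = NumpySorting.from_peaks(peaks,
                                  recording.get_sampling_frequency(),
                                  unit_ids=recording.channel_ids)
sw.plot_rasters(sorting)
```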
A snippet of data would be helpful, indeed, thanks. I guess it might be related to the return_scaled option (default is False) when getting the peaks used during estimation...
You sent me a folder and not directly the Neuralynx data, but I guess you've tried and the error is still there. I'll have a look ASAP.
Actually, on my machine, your snippet (with the recording folder) works as expected if you set return_scaled=False in the get_traces() call. This is because, internally, detect_peaks uses raw unscaled...
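To make the scaled/unscaled distinction concrete, a sketch (assuming a recording with gains and offsets set; the frame range is arbitrary):

```python
import numpy as np

# detect_peaks operates on the raw, unscaled traces internally,
# so peak amplitudes should be compared against the unscaled signal
raw = recording.get_traces(start_frame=0, end_frame=30000, return_scaled=False)
scaled = recording.get_traces(start_frame=0, end_frame=30000, return_scaled=True)

gains = recording.get_channel_gains()
offsets = recording.get_channel_offsets()
# scaled traces are (approximately) raw * gains + offsets, per channel
print(np.allclose(scaled, raw.astype("float32") * gains + offsets))
```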
Actually, this is just a visualization issue. SC2, internally, will always work with unscaled data. So indeed, either you set peak detection to pos (and compare that to your data...
Argh, but indeed, if you have scaled data with a minus sign in the scaling factor, then you should set peak detection to pos for the sorting. You are absolutely...
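A quick sketch of that check (hypothetical helper logic, not part of the library; the sorter parameter layout is assumed from recent SC2 defaults):

```python
import numpy as np

gains = recording.get_channel_gains()

# if the gains are negative, physically negative spikes appear as
# positive deflections in the raw (unscaled) traces that SC2 sees
peak_sign = "pos" if np.all(gains < 0) else "neg"

sorter_params = dict(detection=dict(peak_sign=peak_sign, detect_threshold=5))
```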
Thanks. I'm on holiday right now, but we can discuss that during the wired meeting next week! SC2 is almost there, but clearly on some data, a too low...
Let me jump into the discussion. It is true that SC2 is still evolving (we are starting the paper, so I swear it will settle for good soon), but...
Yes, such recordings were problematic with former versions of SC2 because, when used with lots of cores, the estimate_templates() function called internally during the clustering was preallocating massive amounts of...
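As a workaround on machines with many cores, one can cap the parallelism; a sketch assuming the run_sorter API and SC2's job_kwargs parameter (argument names as in recent SpikeInterface versions):

```python
import spikeinterface.full as si

# limit the number of workers so per-worker preallocation stays bounded
sorting = si.run_sorter(
    "spykingcircus2",
    recording,
    folder="sc2_output",
    job_kwargs=dict(n_jobs=4, chunk_duration="1s"),
)
```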