
Sorting error when using joblib

Open mfvd opened this issue 3 years ago • 9 comments

Hello,

I'm trying to run mountainsort4 in parallel on a new computer (Ubuntu 20.04.4 LTS) and get the error at the bottom of this issue. Reinstalling joblib did not fix it. I'm sorting a 32-tetrode dataset to which I added probe information as follows:

import probeinterface as pi  # probe-handling library used by spikeinterface

# define tetrode geometry and channel indices
tts = pi.ProbeGroup()
for i in range(32):
    tetrode = pi.generate_tetrode()
    tetrode.move([i * 50, 0])
    tts.add_probe(tetrode)
# set indices (channel_indices is defined earlier in the script)
tts.set_global_device_channel_indices(channel_indices)

# attach the probe group to the recording object
rec_tts = rec.set_probegroup(tts, group_mode='by_probe')

For reference the code I'm using for sorting is the following:

# get default params for ms4
default_ms4_params = ss.get_default_params('mountainsort4')
# set params
default_ms4_params['detect_threshold'] = 4
default_ms4_params['detect_sign'] = -1
default_ms4_params['filter'] = False
default_ms4_params['whiten'] = True

# run sorter
sorted_ms4 = ss.run_sorter_by_property(sorter_name='mountainsort4',
                                       recording=rec_cached,
                                       working_folder='ms4_output',  # 'sort_'+str(session)
                                       grouping_property='group',
                                       engine="joblib",
                                       engine_kwargs={"n_jobs": 16},
                                       verbose=False,
                                       **default_ms4_params)
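For context, spikeinterface's launcher forwards `engine_kwargs` straight to `joblib.Parallel` and also reads an optional `backend` key that defaults to `'loky'` (this is visible in the `launcher.py` frames of the traceback below). A minimal sketch of that pass-through, using joblib's `'threading'` backend for portability; passing `backend='multiprocessing'` through `engine_kwargs` the same way might be a workaround when the loky context is broken:

```python
from joblib import Parallel, delayed

def square(x):
    return x * x

# mirrors launcher.py: n_jobs = engine_kwargs.get('n_jobs', -1)
#                      backend = engine_kwargs.get('backend', 'loky')
engine_kwargs = {"n_jobs": 2, "backend": "threading"}
results = Parallel(n_jobs=engine_kwargs.get("n_jobs", -1),
                   backend=engine_kwargs.get("backend", "loky"))(
    delayed(square)(i) for i in range(4))
print(results)  # [0, 1, 4, 9]
```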


ERROR:

Mountainsort4 use the OLD spikeextractors mapped with NewToOldRecording

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/multiprocessing/spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/multiprocessing/spawn.py", line 125, in _main
    prepare(preparation_data)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/multiprocessing/spawn.py", line 231, in prepare
    set_start_method(data['start_method'], force=True)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/multiprocessing/context.py", line 247, in set_start_method
    self._actual_context = self.get_context(method)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/multiprocessing/context.py", line 239, in get_context
    return super().get_context(method)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/multiprocessing/context.py", line 193, in get_context
    raise ValueError('cannot find context for %r' % method) from None
ValueError: cannot find context for 'loky'
[the identical "cannot find context for 'loky'" traceback is repeated for each spawned worker; duplicates omitted]

---------------------------------------------------------------------------
_RemoteTraceback                          Traceback (most recent call last)
_RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/externals/loky/process_executor.py", line 436, in _process_worker
    r = call_item()
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/externals/loky/process_executor.py", line 288, in __call__
    return self.fn(*self.args, **self.kwargs)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/_parallel_backends.py", line 595, in __call__
    return self.func(*args, **kwargs)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/parallel.py", line 262, in __call__
    return [func(*args, **kwargs)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/parallel.py", line 262, in <listcomp>
    return [func(*args, **kwargs)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/sorters/launcher.py", line 33, in _run_one
    run_sorter(sorter_name, recording, output_folder=output_folder,
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/sorters/runsorter.py", line 67, in run_sorter
    return run_sorter_local(sorter_name, recording, output_folder=output_folder,
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/sorters/runsorter.py", line 91, in run_sorter_local
    sorting = SorterClass.get_result_from_folder(output_folder)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/sorters/basesorter.py", line 253, in get_result_from_folder
    raise SpikeSortingError(
spikeinterface.sorters.utils.misc.SpikeSortingError: Spike sorting failed. You can inspect the runtime trace in spikeinterface_log.json
"""

The above exception was the direct cause of the following exception:

SpikeSortingError                         Traceback (most recent call last)
Input In [7], in <cell line: 11>()
      8 default_ms4_params['whiten'] = True
     10 # run sorter
---> 11 sorted_ms4 = ss.run_sorter_by_property(sorter_name = 'mountainsort4',
     12                                        recording = rec_cashed,
     13                                        working_folder = 'ms4_output',  # 'sort_'+str(session)
     14                                        grouping_property = 'group',
     15                                        engine="joblib",
     16                                        engine_kwargs={"n_jobs": 16},
     17                                        verbose = False,
     18                                        **default_ms4_params)
     20 print('Elapsed time: ', time.time()-t_start)
     21 print('Total: '+str(len(sorted_ms4.get_unit_ids()))+' units.')

File ~/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/sorters/launcher.py:114, in run_sorter_by_property(sorter_name, recording, grouping_property, working_folder, mode_if_folder_exists, engine, engine_kwargs, verbose, docker_image, singularity_image, **sorter_params)
    111 assert grouping_property in recording.get_property_keys(), f"The 'grouping_property' {grouping_property} is not " \
    112                                                            f"a recording property!"
    113 recording_dict = recording.split_by(grouping_property)    
--> 114 sorting_output = run_sorters([sorter_name], recording_dict, working_folder,
    115                              mode_if_folder_exists=mode_if_folder_exists,
    116                              engine=engine,
    117                              engine_kwargs=engine_kwargs,
    118                              verbose=verbose,
    119                              with_output=True,
    120                              docker_images={sorter_name: docker_image},
    121                              singularity_images={sorter_name: singularity_image},
    122                              sorter_params={sorter_name: sorter_params})
    124 grouping_property_values = None
    125 sorting_list = []

File ~/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/sorters/launcher.py:270, in run_sorters(sorter_list, recording_dict_or_list, working_folder, sorter_params, mode_if_folder_exists, engine, engine_kwargs, verbose, with_output, docker_images, singularity_images)
    268     n_jobs = engine_kwargs.get('n_jobs', -1)
    269     backend = engine_kwargs.get('backend', 'loky')
--> 270     Parallel(n_jobs=n_jobs, backend=backend)(
    271         delayed(_run_one)(task_args) for task_args in task_args_list)
    273 elif engine == 'dask':
    274     client = engine_kwargs.get('client', None)

File ~/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/parallel.py:1056, in Parallel.__call__(self, iterable)
   1053     self._iterating = False
   1055 with self._backend.retrieval_context():
-> 1056     self.retrieve()
   1057 # Make sure that we get a last message telling us we are done
   1058 elapsed_time = time.time() - self._start_time

File ~/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/parallel.py:935, in Parallel.retrieve(self)
    933 try:
    934     if getattr(self._backend, 'supports_timeout', False):
--> 935         self._output.extend(job.get(timeout=self.timeout))
    936     else:
    937         self._output.extend(job.get())

File ~/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/_parallel_backends.py:542, in LokyBackend.wrap_future_result(future, timeout)
    539 """Wrapper for Future.result to implement the same behaviour as
    540 AsyncResults.get from multiprocessing."""
    541 try:
--> 542     return future.result(timeout=timeout)
    543 except CfTimeoutError as e:
    544     raise TimeoutError from e

File ~/anaconda2/envs/si_openephys/lib/python3.9/concurrent/futures/_base.py:446, in Future.result(self, timeout)
    444     raise CancelledError()
    445 elif self._state == FINISHED:
--> 446     return self.__get_result()
    447 else:
    448     raise TimeoutError()

File ~/anaconda2/envs/si_openephys/lib/python3.9/concurrent/futures/_base.py:391, in Future.__get_result(self)
    389 if self._exception:
    390     try:
--> 391         raise self._exception
    392     finally:
    393         # Break a reference cycle with the exception in self._exception
    394         self = None

SpikeSortingError: Spike sorting failed. You can inspect the runtime trace in spikeinterface_log.json
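The per-worker failure above can be reproduced with nothing but the standard library: each spawned child calls `set_start_method(data['start_method'])` with `'loky'`, but `'loky'` is joblib's own backend, not a CPython start method, so the bare interpreter cannot find it:

```python
import multiprocessing

# CPython only knows these start methods; joblib's 'loky' is not one of them.
print(multiprocessing.get_all_start_methods())  # e.g. ['fork', 'spawn', 'forkserver']

# This is the call that fails inside each spawned worker:
try:
    multiprocessing.get_context('loky')
except ValueError as err:
    print(err)  # cannot find context for 'loky'
```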




mfvd avatar Jul 18 '22 11:07 mfvd

Can you try pip install loky?

alejoe91 avatar Jul 18 '22 12:07 alejoe91

Hi Alessio,

loky is already installed (version 3.1.0).
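One thing worth noting: the remote traceback in the original post points at `joblib/externals/loky/`, i.e. joblib's vendored copy of loky, so the version of the standalone `loky` package may not matter. A quick sketch to check what is actually imported:

```python
# Check the joblib that will actually run, and confirm that its loky
# backend is the copy vendored inside joblib, independent of any
# standalone 'loky' installation.
import joblib
import joblib.externals.loky as vendored_loky

print("joblib", joblib.__version__)
print("loky backend module:", vendored_loky.__name__)
```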

mfvd avatar Jul 18 '22 13:07 mfvd

Maybe upgrade joblib.

samuelgarcia avatar Jul 18 '22 15:07 samuelgarcia

Hi Samuel,

Unfortunately I'm using the latest version of joblib. Reinstalling loky and joblib did not solve the issue.

mfvd avatar Jul 18 '22 16:07 mfvd

Oops. Can you try with `"n_jobs": 1` and check if it works without multiprocessing?

samuelgarcia avatar Jul 19 '22 08:07 samuelgarcia

@samuelgarcia In the meantime I managed to run the sorter by specifying neither the engine nor n_jobs, but it still fails with multiprocessing.

Also, caching a recording object, extracting waveforms, and exporting to phy are extremely slow and only work without parallel processing. In fact, when extracting waveforms or exporting to phy, my notebook kernel crashes almost immediately.

mfvd avatar Jul 19 '22 13:07 mfvd

Extracting waveforms is another story, because for that we do not use joblib.

Can you try running the sorter with joblib but with another sorter?

And can you give us the error when extracting waveforms with n_jobs > 1?

samuelgarcia avatar Jul 19 '22 19:07 samuelgarcia

@samuelgarcia Sorry, I finally managed to do this, and I get the error below when trying to sort with tridesclous. I included the warnings about the recording not having a probe, because I defined my tetrodes beforehand (see the original post in this thread).

/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
[the same UserWarning is repeated many more times; duplicates omitted]
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/sklearn/decomposition/_truncated_svd.py:268: RuntimeWarning: invalid value encountered in true_divide
  self.explained_variance_ratio_ = exp_var / full_var
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/sklearn/decomposition/_truncated_svd.py:268: RuntimeWarning: invalid value encountered in true_divide
  self.explained_variance_ratio_ = exp_var / full_var
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/sklearn/decomposition/_truncated_svd.py:268: RuntimeWarning: invalid value encountered in true_divide
  self.explained_variance_ratio_ = exp_var / full_var
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/sklearn/decomposition/_truncated_svd.py:268: RuntimeWarning: invalid value encountered in true_divide
  self.explained_variance_ratio_ = exp_var / full_var
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/sklearn/decomposition/_truncated_svd.py:268: RuntimeWarning: invalid value encountered in true_divide
  self.explained_variance_ratio_ = exp_var / full_var
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/sklearn/decomposition/_truncated_svd.py:268: RuntimeWarning: invalid value encountered in true_divide
  self.explained_variance_ratio_ = exp_var / full_var
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/sklearn/decomposition/_truncated_svd.py:268: RuntimeWarning: invalid value encountered in true_divide
  self.explained_variance_ratio_ = exp_var / full_var
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')

/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/sklearn/decomposition/_truncated_svd.py:268: RuntimeWarning: invalid value encountered in true_divide
  self.explained_variance_ratio_ = exp_var / full_var
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/sklearn/decomposition/_truncated_svd.py:268: RuntimeWarning: invalid value encountered in true_divide
  self.explained_variance_ratio_ = exp_var / full_var
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/sklearn/decomposition/_truncated_svd.py:268: RuntimeWarning: invalid value encountered in true_divide
  self.explained_variance_ratio_ = exp_var / full_var
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/sklearn/decomposition/_truncated_svd.py:268: RuntimeWarning: invalid value encountered in true_divide
  self.explained_variance_ratio_ = exp_var / full_var
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/core/baserecording.py:466: UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions
  warn('There is no Probe attached to this recording. Creating a dummy one with contact positions')
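As an aside on the sklearn `RuntimeWarning` in the log: it comes from the line `self.explained_variance_ratio_ = exp_var / full_var` shown above. A minimal numpy sketch (the values are made up) shows one way this can arise: if every channel in a group is flat, e.g. dead tetrode channels, the total variance is zero and the ratio is NaN.

```python
import numpy as np

# Minimal reproduction of sklearn's explained_variance_ratio_ computation.
# If the input traces are flat (zero variance), full_var is 0 and the
# division produces NaN -- the "invalid value encountered in true_divide"
# RuntimeWarning seen in the log.
exp_var = np.array([0.0, 0.0])   # per-component explained variance (made up)
full_var = 0.0                   # total variance of the (flat) input
with np.errstate(invalid="ignore"):
    ratio = exp_var / full_var
print(ratio)  # [nan nan]
```

So the warning by itself may just indicate that some group contains channels with no signal, rather than a problem with joblib.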

---------------------------------------------------------------------------
_RemoteTraceback                          Traceback (most recent call last)
_RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/externals/loky/process_executor.py", line 436, in _process_worker
    r = call_item()
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/externals/loky/process_executor.py", line 288, in __call__
    return self.fn(*self.args, **self.kwargs)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/_parallel_backends.py", line 595, in __call__
    return self.func(*args, **kwargs)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/parallel.py", line 262, in __call__
    return [func(*args, **kwargs)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/parallel.py", line 262, in <listcomp>
    return [func(*args, **kwargs)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/sorters/launcher.py", line 33, in _run_one
    run_sorter(sorter_name, recording, output_folder=output_folder,
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/sorters/runsorter.py", line 67, in run_sorter
    return run_sorter_local(sorter_name, recording, output_folder=output_folder,
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/sorters/runsorter.py", line 91, in run_sorter_local
    sorting = SorterClass.get_result_from_folder(output_folder)
  File "/home/neuroimaging/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/sorters/basesorter.py", line 253, in get_result_from_folder
    raise SpikeSortingError(
spikeinterface.sorters.utils.misc.SpikeSortingError: Spike sorting failed. You can inspect the runtime trace in spikeinterface_log.json
"""

The above exception was the direct cause of the following exception:

SpikeSortingError                         Traceback (most recent call last)
Input In [14], in <cell line: 7>()
      4 default_ms4_params['detect_sign'] = -1
      6 # run sorter
----> 7 sorted_ms4 = ss.run_sorter_by_property(sorter_name = 'tridesclous',
      8                                        recording = rec_cashed,
      9                                        working_folder = 'tc_output',  # 'sort_'+str(session)
     10                                        grouping_property = 'group',
     11                                        engine="joblib",
     12                                        engine_kwargs={"n_jobs": 16},
     13                                        verbose = False,
     14                                        **default_ms4_params)

File ~/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/sorters/launcher.py:114, in run_sorter_by_property(sorter_name, recording, grouping_property, working_folder, mode_if_folder_exists, engine, engine_kwargs, verbose, docker_image, singularity_image, **sorter_params)
    111 assert grouping_property in recording.get_property_keys(), f"The 'grouping_property' {grouping_property} is not " \
    112                                                            f"a recording property!"
    113 recording_dict = recording.split_by(grouping_property)    
--> 114 sorting_output = run_sorters([sorter_name], recording_dict, working_folder,
    115                              mode_if_folder_exists=mode_if_folder_exists,
    116                              engine=engine,
    117                              engine_kwargs=engine_kwargs,
    118                              verbose=verbose,
    119                              with_output=True,
    120                              docker_images={sorter_name: docker_image},
    121                              singularity_images={sorter_name: singularity_image},
    122                              sorter_params={sorter_name: sorter_params})
    124 grouping_property_values = None
    125 sorting_list = []

File ~/anaconda2/envs/si_openephys/lib/python3.9/site-packages/spikeinterface/sorters/launcher.py:270, in run_sorters(sorter_list, recording_dict_or_list, working_folder, sorter_params, mode_if_folder_exists, engine, engine_kwargs, verbose, with_output, docker_images, singularity_images)
    268     n_jobs = engine_kwargs.get('n_jobs', -1)
    269     backend = engine_kwargs.get('backend', 'loky')
--> 270     Parallel(n_jobs=n_jobs, backend=backend)(
    271         delayed(_run_one)(task_args) for task_args in task_args_list)
    273 elif engine == 'dask':
    274     client = engine_kwargs.get('client', None)

File ~/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/parallel.py:1056, in Parallel.__call__(self, iterable)
   1053     self._iterating = False
   1055 with self._backend.retrieval_context():
-> 1056     self.retrieve()
   1057 # Make sure that we get a last message telling us we are done
   1058 elapsed_time = time.time() - self._start_time

File ~/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/parallel.py:935, in Parallel.retrieve(self)
    933 try:
    934     if getattr(self._backend, 'supports_timeout', False):
--> 935         self._output.extend(job.get(timeout=self.timeout))
    936     else:
    937         self._output.extend(job.get())

File ~/anaconda2/envs/si_openephys/lib/python3.9/site-packages/joblib/_parallel_backends.py:542, in LokyBackend.wrap_future_result(future, timeout)
    539 """Wrapper for Future.result to implement the same behaviour as
    540 AsyncResults.get from multiprocessing."""
    541 try:
--> 542     return future.result(timeout=timeout)
    543 except CfTimeoutError as e:
    544     raise TimeoutError from e

File ~/anaconda2/envs/si_openephys/lib/python3.9/concurrent/futures/_base.py:439, in Future.result(self, timeout)
    437     raise CancelledError()
    438 elif self._state == FINISHED:
--> 439     return self.__get_result()
    441 self._condition.wait(timeout)
    443 if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:

File ~/anaconda2/envs/si_openephys/lib/python3.9/concurrent/futures/_base.py:391, in Future.__get_result(self)
    389 if self._exception:
    390     try:
--> 391         raise self._exception
    392     finally:
    393         # Break a reference cycle with the exception in self._exception
    394         self = None

SpikeSortingError: Spike sorting failed. You can inspect the runtime trace in spikeinterface_log.json
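The `SpikeSortingError` message points at `spikeinterface_log.json`, which `run_sorter` writes into each sorter output folder; with `run_sorter_by_property` there should be one such file per group subfolder under the working folder (that layout is an assumption). A small stdlib sketch to collect them and find which group actually failed:

```python
import json
from pathlib import Path

def collect_sorter_logs(working_folder):
    """Gather every spikeinterface_log.json under a working folder.

    Returns {subfolder name: parsed log dict}. Assumes one log file is
    written per group subfolder, as run_sorter_by_property does when it
    splits the recording by the grouping property.
    """
    logs = {}
    for log_path in Path(working_folder).rglob("spikeinterface_log.json"):
        with open(log_path) as f:
            logs[log_path.parent.name] = json.load(f)
    return logs

# Example: print each group's log to see which one raised the error.
for group, log in collect_sorter_logs("ms4_output").items():
    print(group, log)
```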

mfvd avatar Jul 22 '22 10:07 mfvd

Hello, I am experiencing the same "UserWarning: There is no Probe attached to this recording. Creating a dummy one with contact positions" when running:

sorting_TDC = si.run_sorter_by_property("tridesclous", recording_tetrodes, grouping_property="group", working_folder="tds_by_group", engine="joblib", engine_kwargs={"n_jobs": 8},**tdc_sorter_params)

This happens for both tridesclous and spyking circus, and I do not get the warning when the default engine settings are used.

Zoe0793 avatar Aug 19 '22 21:08 Zoe0793

@mfvd @Zoe0793 can we close this?

alejoe91 avatar Jun 12 '23 14:06 alejoe91

I don't have this issue anymore. Thanks, Zoe


Zoe0793 avatar Jun 12 '23 14:06 Zoe0793