
AttributeError: Can't get attribute 'Individual' on

Open sparo-jack opened this issue 6 years ago • 6 comments

#24 still has the problem:

Traceback (most recent call last):
  File "C:\Users\uesr\Anaconda3\lib\multiprocessing\process.py", line 258, in _bootstrap
    self.run()
  File "C:\Users\uesr\Anaconda3\lib\multiprocessing\process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\uesr\Anaconda3\lib\multiprocessing\pool.py", line 108, in worker
    task = get()
  File "C:\Users\uesr\Anaconda3\lib\multiprocessing\queues.py", line 337, in get
    return _ForkingPickler.loads(res)
AttributeError: Can't get attribute 'Individual' on <module 'deap.creator' from 'C:\Users\uesr\Anaconda3\lib\site-packages\deap\creator.py'>

Package versions: deap 1.2.2, sklearn-deap 0.2.2

sparo-jack avatar Nov 05 '18 06:11 sparo-jack

I have the same problem:


Process SpawnPoolWorker-9:
Traceback (most recent call last):
  File "C:\Users\Omar\AppData\Local\conda\conda\envs\tensor19\lib\multiprocessing\process.py", line 258, in _bootstrap
    self.run()
  File "C:\Users\Omar\AppData\Local\conda\conda\envs\tensor19\lib\multiprocessing\process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\Omar\AppData\Local\conda\conda\envs\tensor19\lib\multiprocessing\pool.py", line 108, in worker
    task = get()
  File "C:\Users\Omar\AppData\Local\conda\conda\envs\tensor19\lib\multiprocessing\queues.py", line 337, in get
    return _ForkingPickler.loads(res)
AttributeError: Can't get attribute 'Individual' on <module 'deap.creator' from 'C:\\Users\\Omar\\AppData\\Local\\conda\\conda\\envs\\tensor19\\lib\\site-packages\\deap\\creator.py'>
(array([0, 1, 2, 3]), array([3064, 3064, 3064, 3064], dtype=int64))
Process SpawnPoolWorker-10:
Traceback (most recent call last):
  File "C:\Users\Omar\AppData\Local\conda\conda\envs\tensor19\lib\multiprocessing\process.py", line 258, in _bootstrap
    self.run()
  File "C:\Users\Omar\AppData\Local\conda\conda\envs\tensor19\lib\multiprocessing\process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\Omar\AppData\Local\conda\conda\envs\tensor19\lib\multiprocessing\pool.py", line 108, in worker
    task = get()
  File "C:\Users\Omar\AppData\Local\conda\conda\envs\tensor19\lib\multiprocessing\queues.py", line 337, in get
    return _ForkingPickler.loads(res)
AttributeError: Can't get attribute 'Individual' on <module 'deap.creator' from 'C:\\Users\\Omar\\AppData\\Local\\conda\\conda\\envs\\tensor19\\lib\\site-packages\\deap\\creator.py'>

omarcr avatar Mar 12 '19 02:03 omarcr

code only works with 1 worker :(

omarcr avatar Mar 12 '19 02:03 omarcr

This seems to be a bug of the deap library. There is a similar issue here: https://github.com/DEAP/deap/issues/268

On Linux I didn't have any problems running multiple workers, and I don't have a Windows environment to test it on, sorry.
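
A minimal stdlib-only sketch of the underlying mechanism (the `make_individual` helper below is illustrative, not deap's API): `pickle` serializes class instances by reference, i.e. by module and class name, so a class created at runtime, the way `deap.creator.create` creates `Individual`, can fail that name lookup. This is the same lookup a spawned worker performs when it unpickles a task, which is why Windows (spawn) breaks while Linux (fork) does not.

```python
import pickle

def make_individual():
    # Builds a class at runtime, loosely mimicking what
    # deap.creator.create does (illustrative helper, not deap's API).
    return type("Individual", (list,), {})

Ind = make_individual()

# pickle records instances by reference: it tries to look up the
# name "Individual" in the class's module. Since the class is only
# bound to the name "Ind" here, the lookup fails -- analogous to a
# spawned worker failing to find 'deap.creator.Individual'.
try:
    pickle.dumps(Ind([1, 2, 3]))
    outcome = "pickled"
except (pickle.PicklingError, AttributeError):
    outcome = "lookup failed"

print(outcome)  # lookup failed
```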

rsteca avatar Mar 12 '19 13:03 rsteca

I've seen this issue recently. I don't have time to fix it at the moment, but there are some tricks to making it work in the code comments in this project: it has to do with defining some attributes ahead of time, before calling run. If that doesn't work for anyone, I think it would be worth having a Windows dev debug it and write up instructions on the process. This is a small codebase, very open and accessible to outside contributions.

ryanpeach avatar Mar 13 '19 13:03 ryanpeach

I am using deap. I encountered the exact same issue and made it work by following exactly the example at https://github.com/DEAP/deap/blob/master/examples/ga/onemax_mp.py. Make sure the `toolbox.register` calls come before `if __name__ == '__main__':` and that `pool = multiprocessing.Pool(processes=4)` comes after it.

Yike-Li avatar Oct 31 '19 12:10 Yike-Li

I was running into this error when trying to use joblib to run predict on a model generated with sklearn-deap (EvolutionaryAlgorithmSearchCV to be exact). I was able to solve it by following the examples on this page: https://joblib.readthedocs.io/en/latest/auto_examples/serialization_and_wrappers.html

In particular, wrapping my joblib code in `with parallel_backend('multiprocessing'):` solved it for me. By the looks of it this only works on UNIX systems, but some of the other solutions discussed on that page might work on Windows as well.

TiesdeKok avatar Dec 10 '19 23:12 TiesdeKok