
AttributeError: Can't pickle local object 'Dataset.load_data.<locals>.TempClass'

Open bisandud opened this issue 4 years ago • 2 comments

Good day, I hope this meets you well. I tried replicating this result and this is the error I am having. Could you please help me out if possible? Thank you.

THE ERROR MESSAGE HERE:

```
Preprocessing: Reading HDF5 file(s)
Dataset: BikeNYC
C:\Users\s324770\AppData\Local\Continuum\anaconda3\lib\site-packages\h5py\_hl\dataset.py:313: H5pyDeprecationWarning: dataset.value has been deprecated. Use dataset[()] instead.
  "Use dataset[()] instead.", H5pyDeprecationWarning)
before removing 4392
incomplete days: []
after removing 4392
Preprocessing: Min max normalizing
DataFetcher: With Length: 4, 2, 0; with Padding: 0 0, 0 0; with Interval: 1 7.
Dumped 0 data.
Set lr= 0.0003
```

```
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
~\Dropbox\Implement2020\Codes\Attentive Crowd Flow Machines\run_bikenyc.py in <module>
    178 tconf = TrainConfiguration()
    179
--> 180 run(dconf, tconf)

~\Dropbox\Implement2020\Codes\Attentive Crowd Flow Machines\run_bikenyc.py in run(dconf, tconf)
    132
    133     model.train()
--> 134     for i, (X, X_ext, Y, Y_ext) in enumerate(train_loader, 0):
    135         X = X.cuda()
    136         X_ext = X_ext.cuda()

~\AppData\Local\Continuum\anaconda3\lib\site-packages\torch\utils\data\dataloader.py in __iter__(self)
    277             return _SingleProcessDataLoaderIter(self)
    278         else:
--> 279             return _MultiProcessingDataLoaderIter(self)
    280
    281     @property

~\AppData\Local\Continuum\anaconda3\lib\site-packages\torch\utils\data\dataloader.py in __init__(self, loader)
    717             #     before it starts, and __del__ tries to join but will get:
    718             #     AssertionError: can only join a started process.
--> 719             w.start()
    720             self._index_queues.append(index_queue)
    721             self._workers.append(w)

~\AppData\Local\Continuum\anaconda3\lib\multiprocessing\process.py in start(self)
    110                'daemonic processes are not allowed to have children'
    111         _cleanup()
--> 112         self._popen = self._Popen(self)
    113         self._sentinel = self._popen.sentinel
    114         # Avoid a refcycle if the target function holds an indirect

~\AppData\Local\Continuum\anaconda3\lib\multiprocessing\context.py in _Popen(process_obj)
    221     @staticmethod
    222     def _Popen(process_obj):
--> 223         return _default_context.get_context().Process._Popen(process_obj)
    224
    225 class DefaultContext(BaseContext):

~\AppData\Local\Continuum\anaconda3\lib\multiprocessing\context.py in _Popen(process_obj)
    320         def _Popen(process_obj):
    321             from .popen_spawn_win32 import Popen
--> 322             return Popen(process_obj)
    323
    324     class SpawnContext(BaseContext):

~\AppData\Local\Continuum\anaconda3\lib\multiprocessing\popen_spawn_win32.py in __init__(self, process_obj)
     87         try:
     88             reduction.dump(prep_data, to_child)
---> 89             reduction.dump(process_obj, to_child)
     90         finally:
     91             set_spawning_popen(None)

~\AppData\Local\Continuum\anaconda3\lib\multiprocessing\reduction.py in dump(obj, file, protocol)
     58 def dump(obj, file, protocol=None):
     59     '''Replacement for pickle.dump() using ForkingPickler.'''
---> 60     ForkingPickler(file, protocol).dump(obj)
     61
     62 #

AttributeError: Can't pickle local object 'Dataset.load_data.<locals>.TempClass'
```

bisandud avatar Mar 21 '20 15:03 bisandud

What is your environment setting?

It is better to run our code on Linux (Ubuntu); Windows is not recommended.
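For anyone who wants to stay on Windows: the likely root cause is that Windows has no `fork`, so `multiprocessing` uses the `spawn` start method and each `DataLoader` worker receives a pickled copy of the dataset, and a class defined inside a method (here `TempClass` inside `Dataset.load_data`) cannot be pickled. A minimal sketch of the failure and of the module-level fix (all names besides `TempClass` are illustrative, not the repository's actual code):

```python
import pickle

def load_data_local():
    # Mirrors the failing pattern: a class defined inside a
    # function/method is a "local object" that pickle cannot locate
    # by its qualified name.
    class TempClass:
        pass
    return TempClass()

# Fix: define the class at module level so pickle can find it by name.
class TempRecord:
    pass

def load_data_module():
    return TempRecord()

try:
    pickle.dumps(load_data_local())
except (AttributeError, pickle.PicklingError) as exc:
    print("local class fails:", exc)

# The module-level class survives a pickle round trip.
roundtripped = pickle.loads(pickle.dumps(load_data_module()))
print(type(roundtripped).__name__)
```

Moving `TempClass` out of `load_data` to module scope (or replacing it with a regular module-level class) should make the dataset picklable and let multi-worker loading run on Windows.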

liulingbo918 avatar Apr 02 '20 11:04 liulingbo918

> Good day, I hope this meets you well. I tried replicating this result and this is the error I am having. Could you please help me out if possible? Thank you.
>
> *(full error traceback quoted above)*

I also encountered this problem on Windows and didn't find any posted solution. Setting `num_workers=0` in the `DataLoader` solved the problem for me.
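A minimal sketch of this workaround, using a stand-in `TensorDataset` rather than the repository's `Dataset` class: with `num_workers=0` all batches are produced in the main process, so the dataset is never pickled and the "Can't pickle local object" error cannot occur (at the cost of no background loading).

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset standing in for the repository's Dataset class.
data = TensorDataset(torch.arange(10).float().unsqueeze(1), torch.arange(10))

# num_workers=0 disables worker subprocesses, so nothing is pickled
# and the spawn-related AttributeError on Windows goes away.
train_loader = DataLoader(data, batch_size=4, shuffle=True, num_workers=0)

for X, Y in train_loader:
    print(X.shape, Y.shape)
```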

Leo-Shaw avatar Feb 20 '23 16:02 Leo-Shaw