denoising-diffusion-gan
AttributeError: 'EMA' object has no attribute '_optimizer_state_dict_pre_hooks'
Hello there,
Thanks for the awesome work. I tried to reproduce it on CIFAR-10, but ran into the error below. I noticed that the parent class Optimizer has the field '_optimizer_state_dict_pre_hooks', which is not accessible from EMA.
Any tips will be greatly appreciated.
Epoch 001/1200 [3101/3125] -- errD: 1.4794 | errG: 0.9537 | errD_real: 0.7405 | errD_fake: 0.7389 -- ETA: 13 days, 23:38:04.064760
epoch 0 iteration 3100, G Loss: 1.1456470489501953, D Loss: 1.4277989864349365
Epoch 001/1200 [3125/3125] -- errD: 1.4790 | errG: 0.9532 | errD_real: 0.7403 | errD_fake: 0.7387 -- ETA: 13 days, 23:31:45.533348
Saving content.
Traceback (most recent call last):
  File "/home/cruk/anaconda3/envs/dev_py310_pt211/lib/python3.10/site-packages/torch/_dynamo/eval_frame.py", line 328, in _fn
    return fn(*args, **kwargs)
  File "/home/cruk/anaconda3/envs/dev_py310_pt211/lib/python3.10/site-packages/torch/optim/optimizer.py", line 568, in state_dict
    for pre_hook in self._optimizer_state_dict_pre_hooks.values():
AttributeError: 'EMA' object has no attribute '_optimizer_state_dict_pre_hooks'. Did you mean: 'register_state_dict_pre_hook'?
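For reference, here is a minimal, hypothetical repro of the failure mode (not the repo's exact EMA class): an Optimizer subclass whose __init__ skips super().__init__() never gets the attributes Optimizer.__init__ sets up, including _optimizer_state_dict_pre_hooks on PyTorch >= 2.x.

import torch
from torch.optim import Optimizer

class BrokenEMA(Optimizer):
    def __init__(self, opt):
        # Deliberately skips super().__init__(), mirroring the bug.
        self.optimizer = opt  # wraps another optimizer

inner = torch.optim.Adam([torch.nn.Parameter(torch.zeros(1))])
ema = BrokenEMA(inner)
ema.state_dict()  # AttributeError: ... '_optimizer_state_dict_pre_hooks'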
Hi,
I wonder if you solved this problem because I have the same issue.
Best,
You need to add the following two lines to the __init__ method of EMA. There are a few other places that need to be updated accordingly. Hope this helps; a sketch of where the lines fit is below.
self.param_groups = opt.param_groups
self.defaults = opt.defaults
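A sketch of where those two lines could sit, assuming EMA follows the usual optimizer-wrapper shape (every field other than the two added lines is an assumption, and the actual EMA update logic is elided):

from collections import defaultdict
from torch.optim import Optimizer

class EMA(Optimizer):
    def __init__(self, opt, ema_decay=0.9999):
        self.ema_decay = ema_decay
        self.optimizer = opt            # the wrapped optimizer
        self.state = defaultdict(dict)  # per-parameter EMA state
        # Mirror the wrapped optimizer's bookkeeping so inherited
        # Optimizer methods that read these fields keep working:
        self.param_groups = opt.param_groups
        self.defaults = opt.defaults

Note that on PyTorch >= 2.x this alone does not satisfy state_dict(), which also iterates the hook registries discussed below.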
@qiangwang57 @RiceRollsMan I wonder if you solved this problem, because I have the same bug here. I added these two lines to the __init__ method, but it did not help. Could you provide your EMA.py, please?
Including
self._optimizer_state_dict_pre_hooks = OrderedDict()
self._optimizer_state_dict_post_hooks = OrderedDict()
in the __init__ method of the EMA class worked for me. A sketch of the resulting __init__ is below.
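For concreteness, a sketch of that fix in the same hypothetical __init__ as above (everything except the two OrderedDict lines is assumed):

from collections import OrderedDict, defaultdict
from torch.optim import Optimizer

class EMA(Optimizer):
    def __init__(self, opt, ema_decay=0.9999):
        self.ema_decay = ema_decay
        self.optimizer = opt
        self.state = defaultdict(dict)
        self.param_groups = opt.param_groups
        self.defaults = opt.defaults
        # Empty hook registries that Optimizer.state_dict() iterates on
        # PyTorch >= 2.x; they only need to exist, not hold any hooks:
        self._optimizer_state_dict_pre_hooks = OrderedDict()
        self._optimizer_state_dict_post_hooks = OrderedDict()

Depending on the PyTorch version, load_state_dict() may consult analogous _optimizer_load_state_dict_pre_hooks / _optimizer_load_state_dict_post_hooks registries, which would need the same treatment.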
@RiceRollsMan @qiangwang57 @codgodtao
I think the problem is that the Optimizer superclass is never initialized. Adding:
super().__init__(params=opt.param_groups, defaults=opt.defaults)
completes the initialization process. A sketch of this variant is below.
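A sketch of this variant, with the surrounding fields again assumed and the EMA update logic elided; letting Optimizer.__init__ run builds state, param_groups, and the hook registries in one go:

import torch
from torch.optim import Optimizer

class EMA(Optimizer):
    def __init__(self, opt, ema_decay=0.9999):
        self.ema_decay = ema_decay
        self.optimizer = opt
        # Initialize the Optimizer base class from the wrapped optimizer's
        # param_groups and defaults; this creates state, param_groups, and
        # the hook registries that state_dict()/load_state_dict() expect.
        super().__init__(params=opt.param_groups, defaults=opt.defaults)

# Usage: state_dict() no longer raises the AttributeError.
net = torch.nn.Linear(4, 4)
ema = EMA(torch.optim.Adam(net.parameters(), lr=1e-4), ema_decay=0.9999)
_ = ema.state_dict()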