improved-diffusion
RuntimeError: a leaf Variable that requires grad is being used in an in-place operation.
Traceback (most recent call last):
  File "E:\learn_obj\fq_improved_diffusion\improved_diffusion\dist_util.py", line 73, in sync_params
    dist.broadcast(p, 0)
  File "D:\anaconda3\envs\pytorch2.0\lib\site-packages\torch\distributed\distributed_c10d.py", line 1451, in wrapper
    return func(*args, **kwargs)
  File "D:\anaconda3\envs\pytorch2.0\lib\site-packages\torch\distributed\distributed_c10d.py", line 1574, in broadcast
    work.wait()
RuntimeError: a leaf Variable that requires grad is being used in an in-place operation.
You can suppress this error by commenting out line 127, "dist_util.sync_params(self.model.parameters())", in train_util.py. That line is not suitable for Windows, so removing it lets training proceed (it only matters for multi-process runs, where it synchronizes the initial model parameters across ranks).
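For anyone who wants to keep parameter syncing rather than disable it: the error comes from PyTorch refusing an in-place write to a leaf tensor that requires grad, which is what dist.broadcast does to each parameter. A minimal standalone sketch (no distributed setup, just plain torch, assuming torch is installed) reproduces the error and shows the usual workaround of performing the in-place update under torch.no_grad():

```python
import torch

# Reproduce the error: an in-place write to a leaf tensor that
# requires grad raises the same RuntimeError seen in sync_params.
p = torch.zeros(3, requires_grad=True)  # a leaf parameter
try:
    p.copy_(torch.ones(3))              # in-place op on a leaf -> RuntimeError
except RuntimeError as e:
    print("raised:", e)

# Performing the same in-place update under no_grad succeeds,
# because autograd no longer tracks the operation.
with torch.no_grad():
    p.copy_(torch.ones(3))
print(p.tolist())
```

Applied to this repo, a hypothetical patched sync_params would wrap its broadcast loop in `with torch.no_grad():` instead of being commented out; this is a sketch of the general autograd workaround, not a tested change to the improved-diffusion codebase.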