
group["initial_lr"]: unsupported operand type(s) for *: 'NoneType' and 'int'

Open · Amer-Alhamvi opened this issue 2 years ago • 5 comments

```
Traceback (most recent call last):
  File "C:\dev\PycharmProjects\DAN\codes\config\DANv2\train.py", line 349, in <module>
    main()
  File "C:\dev\PycharmProjects\DAN\codes\config\DANv2\train.py", line 207, in main
    model = create_model(opt)  # load pretrained model of SFTMD
  File "C:\dev\PycharmProjects\DAN\codes\config\DANv2\models\__init__.py", line 17, in create_model
    m = M(opt)
  File "C:\dev\PycharmProjects\DAN\codes\config\DANv2\models\blind_model.py", line 90, in __init__
    lr_scheduler.MultiStepLR_Restart(
  File "C:\dev\PycharmProjects\DAN\codes\config\DANv2\models\lr_scheduler.py", line 27, in __init__
    super(MultiStepLR_Restart, self).__init__(optimizer, last_epoch)
  File "C:\dev\PycharmInterpreters\PyTorchStar\lib\site-packages\torch\optim\lr_scheduler.py", line 77, in __init__
    self.step()
  File "C:\dev\PycharmInterpreters\PyTorchStar\lib\site-packages\torch\optim\lr_scheduler.py", line 154, in step
    values = self.get_lr()
  File "C:\dev\PycharmProjects\DAN\codes\config\DANv2\models\lr_scheduler.py", line 34, in get_lr
    return [
  File "C:\dev\PycharmProjects\DAN\codes\config\DANv2\models\lr_scheduler.py", line 35, in <listcomp>
    group["initial_lr"] * weight for group in self.optimizer.param_groups
TypeError: unsupported operand type(s) for *: 'NoneType' and 'int'
```

I tried both PyTorch 1.11 and 1.5.

Amer-Alhamvi · Aug 19 '22

The training .yml file doesn't set the learning rate of the estimator, and you need to add it to the .yml file; that missing value is the reason for the 'NoneType'.

jiangmengyu18 · Oct 04 '22
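For anyone hitting this, here is a minimal sketch (not the actual DAN code) of the failure mode jiangmengyu18 describes: when the .yml has no lr_E entry, the option lookup yields None, the estimator's parameter group is built with lr=None, and the scheduler's get_lr() then multiplies None by an int, which is exactly the TypeError in the traceback. TinyMultiStepRestart and the train_opt dict below are stand-ins; the class only mimics the relevant line of MultiStepLR_Restart.

```python
import torch
from torch.optim.lr_scheduler import _LRScheduler


class TinyMultiStepRestart(_LRScheduler):
    """Stand-in for DAN's MultiStepLR_Restart: scales each group's initial_lr."""

    def get_lr(self):
        restart_weight = 1  # the real restart logic does not matter for the crash
        return [group["initial_lr"] * restart_weight
                for group in self.optimizer.param_groups]


train_opt = {"lr_G": 4e-4}      # "lr_E" was never set in the .yml
lr_E = train_opt.get("lr_E")    # -> None, like the missing option in the config

estimator = torch.nn.Linear(8, 8)
optimizer = torch.optim.Adam([{"params": estimator.parameters(), "lr": lr_E}])

# _LRScheduler copies group["lr"] (None) into group["initial_lr"] and then calls
# step() -> get_lr(), which raises:
#   TypeError: unsupported operand type(s) for *: 'NoneType' and 'int'
scheduler = TinyMultiStepRestart(optimizer)
```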

Then how do I set it?

Lincoln20030413 · Jan 07 '23

> Traceback (most recent call last): ... TypeError: unsupported operand type(s) for *: 'NoneType' and 'int'
>
> I tried both PyTorch 1.11 and 1.5.

Hello, did you solve it? Can you help me?

Lincoln20030413 · Jan 07 '23

> The training .yml file doesn't set the learning rate of the estimator, and you need to add it to the .yml file; that missing value is the reason for the 'NoneType'.

Sorry, but that does not solve it.

Lincoln20030413 · Jan 07 '23

Hello, you must assign a value for "lr_E" in the train section of the .yml file (e.g. lr_E: !!float 4e-4).

eze1376 · Nov 21 '23
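To make that concrete, the train section of the training .yml would then contain something like the sketch below. Only the presence of lr_E is the point, per eze1376's comment; the lr_G key and the 4e-4 values are illustrative assumptions, so keep whatever your existing config already has.

```yaml
# Illustrative sketch only: keep your existing train settings and just add lr_E.
train:
  lr_G: !!float 4e-4   # learning rate of the restorer (assumed to exist already)
  lr_E: !!float 4e-4   # learning rate of the estimator, the previously missing key
```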