
Separate lr_configs for generator optimizer and discriminator optimizer


Is there a way to specify a separate lr_config for the generator optimizer and a separate lr_config for the discriminator optimizer? From my understanding, the lr_config specified in the config file applies to both the generator and the discriminator optimizers.
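For reference, I mean the single shared schedule of this form (a minimal sketch using MMCV's CosineAnnealing LrUpdaterHook policy; the values are illustrative):

# One lr_config drives the LrUpdaterHook for every optimizer in the runner.
lr_config = dict(policy='CosineAnnealing', min_lr=0, by_epoch=False)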

eehitray · Apr 10 '22

Sure, you can set a separate learning rate for each optimizer. Example: https://github.com/open-mmlab/mmgeneration/blob/master/configs/_base_/models/biggan/biggan_128x128.py#L30-L32
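The linked lines set a per-optimizer learning rate along these lines (a sketch; check the link for the exact values):

optimizer = dict(
    generator=dict(type='Adam', lr=0.0001, betas=(0.0, 0.999)),
    discriminator=dict(type='Adam', lr=0.0004, betas=(0.0, 0.999)))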

plyfager · Apr 13 '22

I understand that; however, I wish to set up a separate lr_config; for example, using CosineAnnealing for the generator but no annealing for the discriminator. Is this possible?
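In plain PyTorch terms, the behaviour I am after is something like this (a minimal sketch with hypothetical stand-in modules):

import torch
from torch.optim.lr_scheduler import CosineAnnealingLR

# Hypothetical stand-ins for the GAN's two networks.
generator = torch.nn.Linear(8, 8)
discriminator = torch.nn.Linear(8, 8)

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=4e-4)

# Anneal only the generator's lr; the discriminator's stays fixed.
sched_g = CosineAnnealingLR(opt_g, T_max=100000)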

eehitray · Apr 13 '22

Got your point; this feature is not supported at the moment. We are currently working on a refactor that covers it, which will probably be done by the end of June.

plyfager · May 23 '22

In 1.x, you can set param_scheduler like this for your purpose:

param_scheduler = dict(
    generator=dict(type='LinearLrInterval', xxx),
    discriminator=dict(type=xxx))
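For example, to get cosine annealing on the generator but a fixed learning rate on the discriminator, something like the following should work (a sketch using MMEngine's standard CosineAnnealingLR and ConstantLR schedulers; the exact arguments depend on your training length):

param_scheduler = dict(
    # Anneal the generator's lr over training (illustrative T_max).
    generator=dict(type='CosineAnnealingLR', T_max=100000, by_epoch=False),
    # factor=1.0 leaves the discriminator's lr unchanged throughout.
    discriminator=dict(type='ConstantLR', factor=1.0, by_epoch=False))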

plyfager · Oct 13 '22