Separate lr_configs for generator optimizer and discriminator optimizer
Is there a way to specify one lr_config for the generator optimizer and a separate lr_config for the discriminator optimizer? From my understanding, the lr_config specified in the config file applies to both optimizers.
Sure, you can set a separate learning rate for each optimizer in the optimizer config. Example: https://github.com/open-mmlab/mmgeneration/blob/master/configs/base/models/biggan/biggan_128x128.py#L30-L32
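For illustration, such a config typically keys the optimizer dict by module name so each gets its own learning rate; the values below are placeholders, not taken from the linked file.

optimizer = dict(
    # optimizer (and learning rate) for the generator
    generator=dict(type='Adam', lr=0.0001, betas=(0.0, 0.999)),
    # optimizer (and learning rate) for the discriminator
    discriminator=dict(type='Adam', lr=0.0004, betas=(0.0, 0.999)))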
I understand that; however, I want to set up a separate lr_config, for example using CosineAnnealing for the generator but no annealing for the discriminator. Is this possible?
Got your point; this feature is not supported at the moment. We are currently working on a refactor that covers it, which will probably be done by the end of June.
In 1.x, you can set param_scheduler like this for your purpose:
param_scheduler = dict(
    # scheduler config for the generator optimizer
    generator=dict(type='LinearLrInterval', xxx),
    # scheduler config for the discriminator optimizer
    discriminator=dict(type=xxx))
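For the use case asked about above (annealing only the generator), a minimal sketch under the same per-module convention could look like the following; CosineAnnealingLR and ConstantLR are MMEngine scheduler types, and the T_max/factor values are illustrative assumptions, not settings from any released config.

param_scheduler = dict(
    # anneal the generator's learning rate with a cosine schedule over iterations
    generator=dict(type='CosineAnnealingLR', T_max=100000, by_epoch=False),
    # keep the discriminator's learning rate constant (factor=1.0 leaves it unchanged)
    discriminator=dict(type='ConstantLR', factor=1.0, by_epoch=False))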