ParallelWaveGAN

When fine-tuning from a pre-trained model, should generator_scheduler_params be updated?

Open skol101 opened this issue 3 years ago • 0 comments

I'm fine-tuning HiFi-GAN from a pretrained checkpoint at 2.5M steps, continuing up to 3M steps.

Is updating the milestones like this the right way to go?

generator_optimizer_type: Adam
generator_optimizer_params:
    lr: 2.0e-4
    betas: [0.5, 0.9]
    weight_decay: 0.0
generator_scheduler_type: MultiStepLR
generator_scheduler_params:
    gamma: 0.5
    milestones:
        - 2600000
        - 2700000
        - 2800000
        - 2900000
generator_grad_norm: -1
discriminator_optimizer_type: Adam
discriminator_optimizer_params:
    lr: 2.0e-4
    betas: [0.5, 0.9]
    weight_decay: 0.0
discriminator_scheduler_type: MultiStepLR
discriminator_scheduler_params:
    gamma: 0.5
    milestones:
        - 2600000
        - 2700000
        - 2800000
        - 2900000
discriminator_grad_norm: -1
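
For reference, here is a minimal sketch of how MultiStepLR treats milestones that lie beyond the resume point. This is plain PyTorch rather than the ParallelWaveGAN trainer itself, and it assumes the scheduler is stepped once per training step and that its step counter is restored to roughly 2.5M on resume; the Linear module is just a hypothetical stand-in for the generator.

# Minimal sketch in plain PyTorch (not the ParallelWaveGAN trainer); the resume
# step of 2.5M is an assumption about what gets restored from the checkpoint.
from bisect import bisect_right

import torch

base_lr, gamma = 2.0e-4, 0.5
milestones = [2_600_000, 2_700_000, 2_800_000, 2_900_000]

model = torch.nn.Linear(8, 8)  # stand-in for the generator
optimizer = torch.optim.Adam(
    model.parameters(), lr=base_lr, betas=(0.5, 0.9), weight_decay=0.0
)

# Constructing MultiStepLR with last_epoch set mimics resuming at 2.5M steps;
# PyTorch then expects 'initial_lr' to already exist in each param group.
for group in optimizer.param_groups:
    group.setdefault("initial_lr", base_lr)
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=milestones, gamma=gamma, last_epoch=2_500_000
)
print(scheduler.get_last_lr())  # [0.0002] -- no milestone has been crossed yet

# Closed-form check of the LR between 2.5M and 3M: it is halved at every
# milestone that has been passed (bisect_right counts milestones <= step).
for step in (2_550_000, 2_650_000, 2_750_000, 2_850_000, 2_950_000):
    print(f"step {step}: lr = {base_lr * gamma ** bisect_right(milestones, step):.2e}")

If that matches what the trainer does on resume, then milestones placed after 2.5M as in the config above simply continue halving the learning rate four more times before 3M, while milestones already passed before the resume step have no further effect.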

skol101 · Jun 21 '22 11:06