taming-transformers

epoch

fido20160817 opened this issue 2 years ago

Where do I set the max epoch? The code is too implicit for me to follow...

fido20160817 · Jun 10 '22

It is set to 1000 by default by PyTorch Lightning. You need to set it in the .yaml file, with something like:

lightning:
    trainer:
        (...)
        max_epochs: 2000
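
For anyone who finds main.py hard to follow, here is a minimal sketch of the mechanism (simplified, not the repo's verbatim code): the top-level lightning: block is popped from the merged config, and every key under lightning: trainer: ends up as a pytorch_lightning.Trainer argument, so Lightning's own 1000-epoch default only applies when max_epochs is absent:

    from omegaconf import OmegaConf
    import pytorch_lightning as pl

    # Simplified sketch; the real main.py merges several configs and CLI flags.
    config = OmegaConf.load("configs/custom_vqgan.yaml")
    lightning_config = config.pop("lightning", OmegaConf.create())
    trainer_config = lightning_config.get("trainer", OmegaConf.create())

    # Every key under lightning.trainer becomes a Trainer kwarg; with no
    # max_epochs here, PyTorch Lightning falls back to its default of 1000.
    trainer = pl.Trainer(**OmegaConf.to_container(trainer_config, resolve=True))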

joanrod · Jun 26 '22

many thanks!

fido20160817 · Sep 25 '22

I cannot see any such thing in the yaml file. My config is:

model:
  base_learning_rate: 4.5e-6
  target: taming.models.vqgan.VQModel
  params:
    embed_dim: 256
    n_embed: 1024
    ddconfig:
      double_z: False
      z_channels: 256
      resolution: 256
      in_channels: 3
      out_ch: 3
      ch: 128
      ch_mult: [ 1,1,2,2,4]  # num_down = len(ch_mult)-1
      num_res_blocks: 2
      attn_resolutions: [16]
      dropout: 0.0
    lossconfig:
      target: taming.modules.losses.vqperceptual.VQLPIPSWithDiscriminator
      params:
        disc_conditional: False
        disc_in_channels: 3
        disc_start: 30001
        disc_weight: 0.8
        codebook_weight: 1.0

data:
  target: main.DataModuleFromConfig
  params:
    batch_size: 3
    num_workers: 8
    train:
      target: taming.data.faceshq.FacesHQTrain
      params:
        size: 256
        crop_size: 256
    validation:
      target: taming.data.faceshq.FacesHQValidation
      params:
        size: 256
        crop_size: 256

sanghamitrajohri · Feb 27 '24

Just add the needed keys directly in the .yaml file you are using. For example, I train on my own dataset, so I added it in the custom_vqgan.yaml file. [screenshot of the edited custom_vqgan.yaml]
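
As a sketch of what that edit looks like (assuming the stock custom_vqgan.yaml; 2000 is an illustrative value): the lightning: block goes at the top level of the file, alongside model: and data:.

lightning:
  trainer:
    max_epochs: 2000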

matthew-wave · May 07 '24