taming-transformers
epoch
Where do I set the max epoch? The code is too implicit for me to follow...
It is set to 1000 by default by PyTorch Lightning. You need to set it in the .yaml file with something like:
```yaml
lightning:
  trainer:
    (...)
    max_epochs: 2000
```
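For context: as far as I can tell, the `lightning.trainer` section of the config is forwarded to PyTorch Lightning's `Trainer`, whose `max_epochs` argument defaults to 1000 in the Lightning versions this repo targets. A minimal sketch of the equivalent call, outside the repo's config machinery:

```python
from pytorch_lightning import Trainer

# Equivalent of setting lightning.trainer.max_epochs in the yaml.
# If the key is absent, Lightning falls back to its own default (1000 here).
trainer = Trainer(max_epochs=2000)
print(trainer.max_epochs)  # 2000
```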
many thanks!
I cannot see any such thing in the yaml file.

```yaml
model:
  base_learning_rate: 4.5e-6
  target: taming.models.vqgan.VQModel
  params:
    embed_dim: 256
    n_embed: 1024
    ddconfig:
      double_z: False
      z_channels: 256
      resolution: 256
      in_channels: 3
      out_ch: 3
      ch: 128
      ch_mult: [ 1,1,2,2,4]  # num_down = len(ch_mult)-1
      num_res_blocks: 2
      attn_resolutions: [16]
      dropout: 0.0
    lossconfig:
      target: taming.modules.losses.vqperceptual.VQLPIPSWithDiscriminator
      params:
        disc_conditional: False
        disc_in_channels: 3
        disc_start: 30001
        disc_weight: 0.8
        codebook_weight: 1.0

data:
  target: main.DataModuleFromConfig
  params:
    batch_size: 3
    num_workers: 8
    train:
      target: taming.data.faceshq.FacesHQTrain
      params:
        size: 256
        crop_size: 256
    validation:
      target: taming.data.faceshq.FacesHQValidation
      params:
        size: 256
        crop_size: 256
```
It's not there by default; just add the new keys directly to the .yaml file you are using. For example, I train on my own dataset, so I added the section to the custom_vqgan.yaml file.
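To confirm the key is actually picked up, you can load the edited config with OmegaConf (which the repo already uses for its configs) and read the value back. A small sketch; the file path is just an example, use whichever yaml you train with:

```python
from omegaconf import OmegaConf

# Load the edited config (example path; substitute your own yaml)
cfg = OmegaConf.load("configs/custom_vqgan.yaml")

# Should print 2000 once lightning.trainer.max_epochs has been added
print(cfg.lightning.trainer.max_epochs)
```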