VAR
Training code for VAE
Thanks for the great work and the released code base! 💯
After checking the current training code, I noticed that the VAE is loaded from the pretrained checkpoint vae_ch160v4096z32.pth.
As mentioned in #5, the training code for the VAE will be released at https://github.com/FoundationVision/vae-pro. Is there a target date for it? I'm very interested in training a VAE model on my custom dataset.
Same request here. @FoundationVision, any plans to release the VAE training code?
@kl2004 we're actively cleaning up the code now. For a temporary reference, you can look at the VQVAE forward at https://github.com/FoundationVision/VAR/blob/main/models/vqvae.py#L56-L59.
Thanks,
```python
def forward(self, inp, ret_usages=False):   # -> rec_B3HW, idx_N, loss
    # VectorQuantizer2.forward
    h_BChw, usages, vq_loss, mean_entropy_loss = self.quantize(
        self.quant_conv(self.encoder(inp)), ret_usages=ret_usages
    )
    return self.decoder(self.post_quant_conv(h_BChw)), usages, vq_loss, mean_entropy_loss
```
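For anyone trying to follow the pipeline above before the training code lands: the core of `self.quantize` is a nearest-neighbour codebook lookup. Here is a minimal numpy sketch of that step only (the actual `VectorQuantizer2` additionally does multi-scale residual quantization and computes the VQ and entropy losses, which this sketch omits; `quantize`, `z`, and `codebook` here are illustrative names, not the repo's API):

```python
import numpy as np

def quantize(z, codebook):
    """Map each latent vector to its nearest codebook entry.

    z:        (N, C) array of latent vectors
    codebook: (K, C) array of code vectors
    Returns the quantized latents (N, C) and the chosen indices (N,).
    """
    # Squared L2 distance from every latent to every code -> (N, K)
    d = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    idx = d.argmin(axis=1)          # index of the nearest code per latent
    return codebook[idx], idx

rng = np.random.default_rng(0)
codebook = rng.normal(size=(16, 4))     # K=16 codes of dimension C=4
z = codebook[[3, 7, 3]] + 0.01          # latents sitting near codes 3, 7, 3
zq, idx = quantize(z, codebook)
print(idx)                              # [3 7 3]
```

In the real model this lookup runs on the encoder's feature map at several scales, quantizing the residual left over from the coarser scales.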
Is this code runnable?
I would like to know if the evaluation code (e.g. FID, LPIPS, PSNR, etc.) for the VAE can be released?
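While waiting for the official evaluation code, PSNR at least is simple enough to compute yourself (FID and LPIPS need pretrained networks, e.g. the `pytorch-fid` and `lpips` packages). A minimal sketch, assuming images are float arrays in [0, 1]:

```python
import numpy as np

def psnr(ref, rec, max_val=1.0):
    """Peak signal-to-noise ratio (dB) between a reference and a reconstruction."""
    mse = np.mean((ref.astype(np.float64) - rec.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")    # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# A uniform error of 0.1 everywhere gives mse = 0.01, i.e. 20 dB.
a = np.zeros((8, 8))
b = a + 0.1
print(psnr(a, b))   # 20.0
```

For uint8 images pass `max_val=255` instead.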
Still looking forward to it...
Thanks for the great work @FoundationVision , still looking forward to the training code for multi-res VQVAE!
Still looking forward!
Still looking forward!
Still looking forward!
Still looking forward!
Still looking forward!
Checking every day to see if it's been open-sourced yet 👁️ 👁️
Tears in my eyes. When will it be open-sourced?
Dear all @kl2004 @luohao123 @RobertLuo1 @eanson023 @ArmeriaWang @FanqingM @StarCycle @Junda24 @HalvesChen @SunzeY @z379035389
Many thanks for your patience! Our VAE training and inference codebase and model weights will be released next week. Please stay tuned ❤️!
I'll post the link here and in the README.
Still looking forward!
Still looking forward!
Still not open-sourced?
Try ByteDance's work! They have better work than this.
What is the project link? Thanks!
Hi @keyu-tian, is the VAE training code being released at https://github.com/FoundationVision/vaex?
Still looking forward!
Hi @keyu-tian, are you still planning to release the training code? Looking forward to it!
No, we are not planning to.
Hi Guys,
I am trying to train the VQVAE (with multi-scale VQ) using the LlamaGen VAE training code. However, I found that the codebook utilization rate remains low (about 58%). Does anyone have ideas on how to improve this? Thanks!
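For measuring the utilization you are reporting: one common definition is the fraction of codebook entries that get hit at least once over an evaluation set. A minimal sketch (the function name is illustrative; VAR's own `forward` exposes a `usages` return for this purpose, and common remedies for low utilization include EMA codebook updates, re-initializing dead codes, and entropy regularization, which the VAR forward's `mean_entropy_loss` term suggests the authors use):

```python
import numpy as np

def codebook_usage(indices, vocab_size):
    """Fraction of the K codebook entries that appear at least once in `indices`."""
    return np.unique(indices).size / vocab_size

# Example: a token stream that only ever hits 58 of 100 codes -> 0.58 usage.
token_idx = np.repeat(np.arange(58), 3)
print(codebook_usage(token_idx, 100))   # 0.58
```

Tracking this per-scale (rather than pooled over all scales) can also reveal whether only the coarse or only the fine scales are underusing the codebook.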