tanbuzheng
Thanks for your reply! It was very kind of you! But if so, how should I set self.fake_class_label? Is it reasonable to set it to any value in the range 0-1024?
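(A minimal sketch of what I mean, not code from the MAGE repo. It assumes a VQGAN codebook of 1024 entries; the class names, the value 1024, and the two options shown are purely illustrative of the question.)

```python
# Hypothetical illustration of the two interpretations in my question above.

CODEBOOK_SIZE = 1024  # assumed VQGAN codebook size (indices 0..1023)

class DummyModel:
    def __init__(self, codebook_size: int = CODEBOOK_SIZE):
        # Option A (what I am asking about): reuse an arbitrary existing index,
        # e.g. any value in [0, codebook_size).
        # self.fake_class_label = 42

        # Option B: reserve a dedicated index just past the codebook so the
        # placeholder label cannot collide with a real token index.
        self.fake_class_label = codebook_size  # i.e. 1024

model = DummyModel()
print(model.fake_class_label)
```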
OK! Thanks a lot!
By the way, the current code does not seem to contain the contrastive loss part. May I ask whether you have any plans to release the complete training...
Hello, author! Did you re-train the VQGAN yourself? It seems different from the pre-trained model released by [VQGAN](https://heibox.uni-heidelberg.de/d/8088892a516d4e3baf92/). So if I want to apply MAGE to other datasets,...
Thanks a lot! But I would like to ask: why didn't you just use the pre-trained model provided by VQGAN? Is there an issue with it?
I got it! Thank you so much!
Sorry to bother you again. My computing resources are limited. If I drop the contrastive loss, or just use MoCo v2 in MAGE training, can I set the...
Ok, thanks again!
Hello, author! Sorry to bother you again. As I understand it, MAGE can be trained on a V100 with batch size 64. Recently I made a preliminary attempt to train MAGE on a...
Thank you very much! I just found out that I was training MAGE-L. I should try to train MAGE-B instead.