ViTGAN
Hi, does the code run properly?
I can run it, but it doesn't work. Did you solve it?
No, it does not work for me either.
I also ran into this problem. I'm trying to solve it.
I am also looking forward to this code being updated.
I have tried several modifications, such as applying deeper MLPs to the patch outputs and adding positional embeddings, but they didn't work.
Outputs: the generated patches are all the same. Are there any suggestions? I would be very grateful.
It doesn't converge at all.
Hi, it seems there is a bug in the L2 Attention module.
The weights of the projection matrices for the query and key in L2 Attention should be tied (i.e. share the same weights). Otherwise, the Lipschitzness cannot be guaranteed. This is missing from the code.
Hope it helps. :-)
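For reference, here is a minimal PyTorch sketch of what tying the query/key projection in L2 self-attention could look like. The class and variable names are my own assumptions for illustration, not the repo's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class L2SelfAttention(nn.Module):
    """Sketch of L2 (distance-based) multi-head self-attention with tied
    query/key weights: the attention logits are -||x_i W - x_j W||^2 / sqrt(d_h),
    where the single shared projection W is what the Lipschitz argument relies on.
    """

    def __init__(self, dim, num_heads=4):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5

        # One projection shared by query and key (tied weights).
        self.to_qk = nn.Linear(dim, dim, bias=False)
        self.to_v = nn.Linear(dim, dim, bias=False)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):
        B, N, C = x.shape
        H, D = self.num_heads, self.head_dim

        # Query and key come from the same linear layer, so W_q == W_k.
        qk = self.to_qk(x).reshape(B, N, H, D).permute(0, 2, 1, 3).reshape(B * H, N, D)
        v = self.to_v(x).reshape(B, N, H, D).permute(0, 2, 1, 3).reshape(B * H, N, D)

        # Negative squared L2 distances replace the usual dot-product logits.
        dist = torch.cdist(qk, qk, p=2) ** 2               # (B*H, N, N)
        attn = F.softmax(-dist * self.scale, dim=-1)

        out = attn @ v                                      # (B*H, N, D)
        out = out.reshape(B, H, N, D).permute(0, 2, 1, 3).reshape(B, N, C)
        return self.proj(out)
```

A quick sanity check: `L2SelfAttention(dim=64, num_heads=4)(torch.randn(2, 16, 64))` should return a tensor of the same shape as its input.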
Excuse me, has anyone gotten the code to run successfully? How do you run it?