VaDE-pytorch
hyper-parameters
Hello,
Thanks for sharing this code. I was wondering if you could also share the hyper-parameters you used to obtain the training curve on the front page: 94% cluster accuracy after 300 epochs. Running the code out of the box, it seems to plateau at around 80%.
Thank you,
Best regards,
Miguel
I have the same problem!!!
I just ran the code and it got 94%+ accuracy.
I met the same problem; just setting --hid_dim 50 gave me 94%+ accuracy.
I set hid_dim to 50 but got worse accuracy. Any ideas?
The variance of the method is quite high. There is no way to avoid this. Try to repeat the experiment at least 10 times.
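For reference, the cluster accuracy quoted here is the usual unsupervised metric: predicted cluster labels are matched to the ground-truth labels with the Hungarian algorithm. A minimal sketch of that metric, together with averaging over repeated runs, is below; `train_vade` is a hypothetical placeholder for one full training run and is not a function from this repository.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def cluster_accuracy(y_true, y_pred):
    """Unsupervised clustering accuracy: best one-to-one matching
    between predicted clusters and ground-truth labels."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    n = max(y_true.max(), y_pred.max()) + 1
    # Count how many samples of true class t land in predicted cluster p.
    cost = np.zeros((n, n), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        cost[p, t] += 1
    row, col = linear_sum_assignment(-cost)  # maximize matched counts
    return cost[row, col].sum() / y_true.size

# Hypothetical driver: repeat the whole experiment with different seeds
# and report mean and standard deviation, since single runs vary a lot.
# accs = [cluster_accuracy(y_true, train_vade(seed=s)) for s in range(10)]
# print(np.mean(accs), np.std(accs))
```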
This method relies on a GMM to initialize the network; you can reach the reported result by trying a few more runs!
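For context, the GMM initialization step usually looks like the sketch below: encode the data with the pretrained encoder, fit a `GaussianMixture` on the latent codes, and copy its parameters into the model's cluster prior. The attribute names `pi_`, `mu_c`, `log_var_c` and the `encode` method are assumptions for illustration, not necessarily the names used in this repository.

```python
import numpy as np
import torch
from sklearn.mixture import GaussianMixture

def init_gmm(model, data_loader, n_clusters=10, device="cpu"):
    """Fit a GMM on pretrained latent codes and copy its parameters
    into the VaDE cluster prior (attribute names are assumed)."""
    model.eval()
    codes = []
    with torch.no_grad():
        for x, _ in data_loader:
            mu, _ = model.encode(x.to(device))  # assumed: encoder returns (mu, log_var)
            codes.append(mu.cpu().numpy())
    z = np.concatenate(codes, axis=0)

    gmm = GaussianMixture(n_components=n_clusters, covariance_type="diag")
    gmm.fit(z)

    # Copy GMM parameters into the model's prior (assumed attribute names).
    model.pi_.data = torch.from_numpy(gmm.weights_).float()
    model.mu_c.data = torch.from_numpy(gmm.means_).float()
    model.log_var_c.data = torch.log(torch.from_numpy(gmm.covariances_).float())
    return gmm
```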
@GuHongyang Yes, but that is unfair, because in practice you simply do not have any labels to evaluate your results.
Yes. The core problem is still the algorithm itself. You can hold out part of the training data as a validation set and use it to find better GMM initialization parameters.
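One way to implement that suggestion, without using any labels, is to hold out part of the latent codes and keep the GMM restart with the best held-out log-likelihood. A rough sketch, with illustrative names, follows.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import train_test_split

def select_gmm(z, n_clusters=10, n_restarts=10, val_frac=0.1, seed=0):
    """Pick the GMM initialization with the highest log-likelihood on
    held-out latent codes; no labels are needed, so the choice stays
    fully unsupervised."""
    z_fit, z_val = train_test_split(z, test_size=val_frac, random_state=seed)
    best_gmm, best_score = None, -np.inf
    for s in range(n_restarts):
        gmm = GaussianMixture(n_components=n_clusters,
                              covariance_type="diag", random_state=s)
        gmm.fit(z_fit)
        score = gmm.score(z_val)  # mean per-sample log-likelihood
        if score > best_score:
            best_gmm, best_score = gmm, score
    return best_gmm
```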
Thanks for your comments. I am wondering whether the accuracy will be affected if I don't use a GPU (I removed the .cuda() calls).
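For what it's worth, moving from GPU to CPU should only affect speed, not accuracy, apart from minor numerical differences. A device-agnostic pattern like the sketch below (standard PyTorch; `model` and `data_loader` are placeholders for the objects in this repository) runs the same code on either device.

```python
import torch

# Use the GPU when available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = model.to(device)      # placeholder: your VaDE model instance
for x, _ in data_loader:      # placeholder: your training DataLoader
    x = x.to(device)
    ...                       # forward pass, loss, optimizer step as usual
```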
Thanks for the advice. I have the same problem, and I wonder whether the network is being initialized correctly. I ran the code about 10 times with PyTorch 1.7, but it never achieves more than 82%. Is there any difference in network initialization between PyTorch versions? Also, could you share the final code in this repository?
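As far as I know, the default initialization of standard layers such as nn.Linear has been stable across recent PyTorch 1.x releases, so the version alone is unlikely to explain the gap. You can still rule initialization out as a variable by making it explicit, for example with a sketch like this (Xavier initialization is just one common choice, not necessarily what the original implementation used):

```python
import torch.nn as nn

def init_weights(m):
    """Explicit Xavier initialization, so results do not depend on the
    installed PyTorch version's default scheme."""
    if isinstance(m, nn.Linear):
        nn.init.xavier_uniform_(m.weight)
        if m.bias is not None:
            nn.init.zeros_(m.bias)

# model.apply(init_weights)  # apply to every submodule of the VaDE model
```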