priyarana
Thanks for the reply. I was going through the following blog, which talks about two types of augmentations: https://wandb.ai/authors/scl/reports/Improving-Image-Classifiers-With-Supervised-Contrastive-Learning--VmlldzoxMzQwNzE. I have yet to read the whole paper, so I'm not sure...
I'll see how results behave with these two different approaches. Thank you for the insights.
I see the paper uses a batch size of 256, and I am not sure my GPU can afford that. Do you think the method works well with...
Sure, thanks. I am hoping to get better results than standard supervised learning, which uses only cross-entropy.
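For reference, here is a minimal sketch of the supervised contrastive (SupCon) loss that the blog and paper build on, assuming L2-normalised embeddings and a batch in which views sharing a label are treated as positives; `supcon_loss`, `features`, and `labels` are placeholder names, not from this thread:

```python
import torch
import torch.nn.functional as F

def supcon_loss(features, labels, temperature=0.1):
    # features: (N, D) embeddings of augmented views; labels: (N,) class ids.
    features = F.normalize(features, dim=1)
    sim = features @ features.T / temperature            # pairwise similarities
    self_mask = torch.eye(len(labels), dtype=torch.bool, device=features.device)
    sim = sim.masked_fill(self_mask, -1e9)               # exclude self-comparisons
    # positives: other views with the same label as the anchor
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # mean log-likelihood over each anchor's positives, averaged over anchors
    loss = -(log_prob.masked_fill(~pos_mask, 0.0).sum(1)
             / pos_mask.sum(1).clamp(min=1))
    return loss.mean()
```

In the two-stage setup from the SupCon paper, a loss like this trains the encoder first, and a linear classifier is then fit on the frozen features with plain cross-entropy.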
Hi. My loss dropped to -0.17; if I train it further, the loss starts increasing. Should I consider -0.17 the convergence point then? Any inputs, please.
Actually, this is not an issue! This is how a WGAN gets trained. During training, the loss value keeps dropping up to a certain point, after which it starts rising.
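As a concrete illustration, here is a minimal PyTorch-style sketch of the standard WGAN objectives (names like `critic`, `real_batch`, and `fake_batch` are placeholders, not from your code). The critic loss naturally goes negative once the critic scores real samples above fakes, so a value like -0.17 by itself is not alarming:

```python
import torch

def critic_loss(critic, real_batch, fake_batch):
    # The WGAN critic maximises E[D(real)] - E[D(fake)] (an estimate of the
    # Wasserstein distance), so we minimise its negative. A useful critic
    # therefore drives this loss below zero.
    return -(critic(real_batch).mean() - critic(fake_batch).mean())

def generator_loss(critic, fake_batch):
    # The generator maximises E[D(fake)], i.e. minimises -E[D(fake)].
    return -critic(fake_batch).mean()
```

Plotting the negated critic loss (the Wasserstein estimate) over training, and checking sample quality alongside it, is usually more informative than watching for one fixed convergence value.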
The following paper implemented WGAN-div and compared it with other WGAN variants: https://www.nature.com/articles/s41598-022-22882-x. Refer to the supplementary material as well. The paper also explains the implementation of WGAN.