UnsupervisedDeepLearning-Pytorch

The results of different runs differ a lot.

Open klovbe opened this issue 6 years ago • 6 comments

I am running this code on my biological datasets. When I run it multiple times, it produces quite different results: the ARIs vary from 0.4 to 0.8. I found that the pretrained model has a large effect on the results. How can I get robust results by obtaining a good pretrained model? Is pretraining the denoising autoencoder (DAE) important for the stacked denoising autoencoder (SDAE) when the structure of the DAE is not exactly the same as that of the SDAE?

klovbe avatar Apr 24 '18 11:04 klovbe

Which code are you using? The stacked denoising autoencoder?

eelxpeng avatar Apr 25 '18 12:04 eelxpeng

Sorry about the ambiguity! When I run the VaDE code multiple times, it produces quite different results: the ARIs vary from 0.4 to 0.8. I found that the pretrained SDAE model has a large effect on the VaDE clustering results. How can I get robust results by obtaining a good pretrained SDAE model? Is pretraining the DAE important for the SDAE when the structure of the DAE is not exactly the same as that of the SDAE?

klovbe avatar Apr 26 '18 04:04 klovbe

This is actually an issue with the original paper and their released code. I also find it annoying. But I have found that using Xavier initialization for the weights alleviates the problem a lot. You could try it.
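
In case it helps, here is a minimal sketch of applying Xavier initialization to a PyTorch model's linear layers before pretraining. The model and function names are illustrative assumptions, not this repo's actual API:

```python
import torch.nn as nn

def init_weights_xavier(module):
    # Apply Xavier (Glorot) uniform initialization to every linear layer
    # and zero the biases; other layer types are left untouched.
    if isinstance(module, nn.Linear):
        nn.init.xavier_uniform_(module.weight)
        if module.bias is not None:
            nn.init.zeros_(module.bias)

# Usage (hypothetical): call model.apply(init_weights_xavier) on the
# SDAE/VaDE network before starting pretraining.
```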

eelxpeng avatar Apr 26 '18 04:04 eelxpeng

How can I get pretrained_vade-3layer.pt?

chulaihunde avatar Jun 03 '18 13:06 chulaihunde

@chulaihunde I converted it from the original VaDE page. But I really don't know how they got the pretrained weights. Let me know if you figure that out.

eelxpeng avatar Jun 03 '18 22:06 eelxpeng

You may be able to refer to https://github.com/piiswrong/dec/blob/master/dec/pretrain.py
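
For context, that script does greedy layer-wise pretraining of denoising autoencoders. A rough PyTorch sketch of the same idea (the noise level, optimizer, and training loop are illustrative assumptions, not the DEC code itself):

```python
import torch
import torch.nn as nn

def pretrain_layer(encoder, decoder, data_loader, epochs=10, noise=0.2, lr=1e-3):
    # Train one (encoder, decoder) pair as a denoising autoencoder:
    # corrupt the input, then reconstruct the clean input.
    params = list(encoder.parameters()) + list(decoder.parameters())
    optimizer = torch.optim.Adam(params, lr=lr)
    criterion = nn.MSELoss()
    for _ in range(epochs):
        for x in data_loader:  # assumed to yield float tensors
            x_noisy = x + noise * torch.randn_like(x)  # additive Gaussian corruption
            recon = decoder(encoder(x_noisy))
            loss = criterion(recon, x)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

# Greedy stacking: pretrain layer 1 on the raw data, then layer 2 on the
# (fixed) codes produced by layer 1, and so on; finally fine-tune the
# whole SDAE end to end before using it to initialize VaDE.
```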

chulaihunde avatar Jun 07 '18 00:06 chulaihunde