CoCLR
When can the checkpoints of Kinetics-400 be uploaded?
Maybe later this week or next week, I am still doing some final check. Will let you know in this issue if it's uploaded.
Thanks! Look forward to your update!
Is there any news on the K400 checkpoints? Looking forward to it as well.
Also waiting here, as I cannot reproduce the results for K400 training.
Same here.
Sorry for the long delay... they have been uploaded now. https://github.com/TengdaHan/CoCLR#pretrained-weights
Thanks for uploading this. Strangely, I cannot reproduce this result following the given instructions. I noticed that for your InfoNCE training you run 'main_infonce.py' and 'teco_fb_main.py' instead of 'main_nce.py'. Are they the same file?
In fact, I cannot reproduce the result for UCF101 pretraining either. If anyone has succeeded in reproducing the result with the latest PyTorch release, please let me know.
@TengdaHan Hi Tengda, thanks for uploading it! I am a little confused: are these just the weights from training on K400 only, or from a joint training on K400 first and then UCF101? Here is the retrieval performance I got on UCF101 without any further training:
1-NN acc = 0.5062, 5-NN acc = 0.6845, 10-NN acc = 0.7638, 20-NN acc = 0.8371, 50-NN acc = 0.9082
@June01 The weights in "Kinetics400-pretrained models" are self-supervised, trained on K400 only. Hmm, it seems your retrieval result here "1-NN = 0.5062" is better than what I got with the same model last year. But in any case, in our paper we only report NN-retrieval with UCF101-pretrained weights.
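For anyone cross-checking their retrieval numbers, the k-NN retrieval accuracy reported above can be computed from extracted clip features roughly as follows. This is a minimal NumPy sketch for illustration, not the repo's actual evaluation code; the function name and signature are hypothetical:

```python
import numpy as np

def knn_retrieval_acc(query_feats, query_labels,
                      gallery_feats, gallery_labels,
                      ks=(1, 5, 10, 20, 50)):
    """A query counts as correct at k if ANY of its k nearest
    gallery clips (by cosine similarity) shares the query's class."""
    # L2-normalise so that a dot product equals cosine similarity
    q = query_feats / np.linalg.norm(query_feats, axis=1, keepdims=True)
    g = gallery_feats / np.linalg.norm(gallery_feats, axis=1, keepdims=True)
    sim = q @ g.T                        # (num_query, num_gallery)
    order = np.argsort(-sim, axis=1)     # neighbours, most similar first
    accs = {}
    for k in ks:
        topk_labels = gallery_labels[order[:, :k]]            # (num_query, k)
        hit = (topk_labels == query_labels[:, None]).any(1)   # any match?
        accs[k] = hit.mean()
    return accs
```

In the UCF101 protocol the gallery is usually the training split and the queries are the test split, with one feature vector per video.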
@thematrixduo Regarding the filename - sorry, they are the same file; I changed the name. What accuracy did you get when reproducing? If the UCF101-RGB finetune result is about 86%-88%, I think it's acceptable. Our 90+% result is obtained by fusing two-stream predictions.
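For reference, late fusion of two-stream (RGB + Flow) predictions is commonly done by averaging the per-stream softmax probabilities before taking the argmax. A minimal sketch under that assumption (hypothetical helper, not the repo's code):

```python
import numpy as np

def fuse_two_stream(rgb_logits, flow_logits):
    """Late fusion: average per-stream softmax probabilities,
    then pick the most likely class per sample."""
    def softmax(x):
        e = np.exp(x - x.max(axis=1, keepdims=True))  # stable softmax
        return e / e.sum(axis=1, keepdims=True)
    probs = 0.5 * (softmax(rgb_logits) + softmax(flow_logits))
    return probs.argmax(axis=1)
```

A more confident stream (sharper softmax) thus dominates the fused prediction, which is typically where the extra points over a single RGB model come from.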
Thanks for the reply. I can only get 82% with K400 pretraining, and only 78% with UCF101 pretraining (2 cycles). For the UCF101 pretraining I used your uploaded LMDB data and strictly followed the instructions given here.
Did you pretrain any other architectures, like R(2+1)D-18, on Kinetics-400?
@fmthoker No, we didn't. We only used the S3D backbone in our experiments.
@thematrixduo I updated the code to fix an issue that might have reduced the training efficiency of the co-training stage. It may be related to the UCF101 reproduction: https://github.com/TengdaHan/CoCLR/issues/43