Problems with training

Luo-Ji-X opened this issue on Nov 17 '22 · 3 comments

Dear authors: I downloaded your code and trained it on 2 GTX 1080 Ti GPUs according to your README. However, the training results are much worse than reported. Is there something wrong with my setup? The training results are attached below.

          0         18        36        54        ...  144    162    180    mean
NM#5-6    0.937374  0.935354  0.948000  0.942424  ...  0.950  0.948  0.912  0.939596
BG#1-2    0.809091  0.829293  0.826263  0.835714  ...  0.816  0.840  0.787  0.811113
CL#1-2    0.780000  0.810000  0.819000  0.806122  ...  0.827  0.817  0.800  0.808069

Luo-Ji-X · Nov 17 '22 12:11
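For reference, here is a minimal sketch (not the repository's evaluation code) of how the "mean" column above is typically obtained: averaging the per-view rank-1 accuracies over the 11 CASIA-B probe angles. The accuracy values in the sketch are placeholders.

```python
# Minimal sketch, not the repository's code: average rank-1 accuracy over the
# 11 probe view angles (0-180 degrees in steps of 18) for one condition.
import numpy as np

view_angles = list(range(0, 181, 18))  # 0, 18, ..., 180

# Placeholder per-view rank-1 accuracies (e.g. for NM#5-6); real values come
# from the evaluation script.
per_view_acc = {angle: 0.94 for angle in view_angles}

mean_acc = np.mean(list(per_view_acc.values()))
print(f"mean rank-1 accuracy over {len(view_angles)} views: {mean_acc:.4f}")
```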

The results have high variance due to the small data regime and the noise from pose estimation; the "Coat" (CL) condition has the highest variance in particular. So we ran 8 experiments for each architecture and reported the best results. We also provide the weights of the best run. You can find them here.

exitudio · Dec 02 '22 21:12
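As an illustration of the best-of-N protocol described above, here is a hedged sketch. The `train_and_evaluate` function is a hypothetical placeholder for a full training run with the repository's default settings, not an actual entry point of the code base.

```python
# Minimal sketch of the "run N experiments, keep the best" protocol described
# above. train_and_evaluate is a hypothetical placeholder for a real training
# run; the actual entry points and defaults live in the repository (common.py).
import random

def train_and_evaluate(seed: int) -> float:
    """Placeholder: train one model with the given seed, return mean accuracy."""
    random.seed(seed)
    return 0.90 + random.random() * 0.05  # stand-in for a real training run

best_seed, best_acc = None, -1.0
for seed in range(8):  # 8 runs per architecture, as the authors describe
    acc = train_and_evaluate(seed)
    if acc > best_acc:
        best_seed, best_acc = seed, acc

print(f"best run: seed={best_seed}, mean accuracy={best_acc:.4f}")
```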

Did you use the same hyperparameters as the defaults in common.py when running the experiments? I'm really confused about the results I got when training on a V100. The mean accuracies are NM#5-6: 0.8499, BG#1-2: 0.7028, CL#1-2: 0.6968.

Tamako888 · Jun 01 '24 13:06

Yes, we used the same default parameters as common.py. For BG and CL, the evaluation has very high variance, but NM should not show much difference. Can you run an evaluation on our pre-trained model? Does it get results similar to those reported?

exitudio · Jun 05 '24 16:06
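For readers who want to sanity-check a reported number, below is a self-contained sketch of a gallery/probe rank-1 evaluation of the kind used for CASIA-B. The random embeddings stand in for features produced by the pre-trained model, and the checkpoint path is a placeholder; the repository's own evaluation script remains the authoritative way to reproduce the reported accuracies.

```python
# Self-contained sketch of checking a rank-1 result for a gallery/probe
# evaluation. The checkpoint loading at the top is only indicative
# ("gaitmixer_best.pt" is a placeholder path), and the random embeddings below
# stand in for features the pre-trained model would produce.
import torch

# --- loading released weights (placeholder; uncomment when available) ---
# state = torch.load("gaitmixer_best.pt", map_location="cpu")
# model.load_state_dict(state)  # then embed gallery/probe sequences with `model`

# --- rank-1 accuracy from embeddings ---
torch.manual_seed(0)
num_ids, dim = 50, 128
gallery_emb = torch.randn(num_ids, dim)                     # stand-in gallery features
probe_emb = gallery_emb + 0.1 * torch.randn(num_ids, dim)   # noisy probe features
gallery_ids = probe_ids = torch.arange(num_ids)

dists = torch.cdist(probe_emb, gallery_emb)   # probe-to-gallery distances
pred = gallery_ids[dists.argmin(dim=1)]       # nearest-gallery identity per probe
rank1 = (pred == probe_ids).float().mean().item()
print(f"rank-1 accuracy: {rank1:.4f}")
```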