
Hyperparameter tuning

Open goutamyg opened this issue 2 years ago • 2 comments

Hi! Thank you for publishing your code.

Your released code has a directory of configuration files corresponding to the different tracker modules: https://github.com/PinataFarms/FEARTracker/tree/main/model_training/config

It contains parameters/choices related to training and inference (optimizer, learning-rate scheduler, penalty_k, window_influence, and lr, to name a few). Could you please clarify which dataset was used to tune these hyperparameter values? Were they tuned on the test set itself?
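For context, my understanding is that penalty_k, window_influence, and lr are the usual inference-time parameters of SiamRPN-style score-map post-processing (scale/ratio penalty, cosine-window blending, and smoothed size update). The sketch below is only my own illustration of that scheme; the function name, default values, and exact formulas are my assumptions, not your code:

```python
import numpy as np

def postprocess_scores(score_map, pred_w, pred_h, prev_w, prev_h,
                       penalty_k=0.04, window_influence=0.45, lr=0.95):
    """Illustrative SiamRPN-style post-processing of a tracker score map.

    score_map        -- 2D classification scores over search-region locations
    pred_w, pred_h   -- 2D maps of predicted box width/height per location
    prev_w, prev_h   -- scalar box size from the previous frame
    penalty_k        -- penalizes large scale/aspect-ratio changes
    window_influence -- weight of the cosine window (motion prior)
    lr               -- smoothing factor for the box-size update
    (default values here are placeholders, not the released config values)
    """
    def change(r):
        return np.maximum(r, 1.0 / r)

    # Penalty for deviating from the previous frame's scale and aspect ratio
    s_c = change(np.sqrt(pred_w * pred_h) / np.sqrt(prev_w * prev_h))
    r_c = change((prev_w / prev_h) / (pred_w / pred_h))
    penalty = np.exp(-(s_c * r_c - 1.0) * penalty_k)
    pscore = penalty * score_map

    # Blend in a cosine window centred on the previous target position
    hann = np.outer(np.hanning(score_map.shape[0]),
                    np.hanning(score_map.shape[1]))
    pscore = pscore * (1 - window_influence) + hann * window_influence

    best = np.unravel_index(np.argmax(pscore), pscore.shape)

    # Smooth the size update; `lr` controls how fast the box size adapts
    size_lr = penalty[best] * score_map[best] * lr
    new_w = prev_w * (1 - size_lr) + pred_w[best] * size_lr
    new_h = prev_h * (1 - size_lr) + pred_h[best] * size_lr
    return best, new_w, new_h
```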

Also, I am particularly intrigued by a statement in the paper: "For each epoch, we randomly sample 20,000 images from LaSOT, 120,000 from COCO, 400,000 from YoutubeBB, 320,000 from GOT10k and 310,000 images from the ImageNet dataset". Could you explain the reasoning behind this sampling split, as opposed to sampling uniformly across the datasets?
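The way I read that sentence is that each epoch draws a fixed, dataset-specific quota of frames and then shuffles them together. The snippet below is only my own illustration of such per-epoch resampling; the function and variable names are hypothetical and not taken from your repository:

```python
import random

# Per-epoch sample counts as quoted from the paper
EPOCH_SAMPLES = {
    "LaSOT": 20_000,
    "COCO": 120_000,
    "YoutubeBB": 400_000,
    "GOT10k": 320_000,
    "ImageNet": 310_000,
}

def build_epoch_index(dataset_indices):
    """dataset_indices: dict mapping dataset name -> list of available frame ids.

    Returns a shuffled list of (dataset_name, frame_id) pairs for one epoch,
    drawing the per-dataset quotas above (with replacement, so a dataset
    smaller than its quota is simply oversampled).
    """
    epoch = []
    for name, quota in EPOCH_SAMPLES.items():
        ids = dataset_indices[name]
        picks = random.choices(ids, k=quota)  # sample with replacement
        epoch.extend((name, i) for i in picks)
    random.shuffle(epoch)
    return epoch
```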

goutamyg · May 17 '23 22:05

Hello, have you trained the model yourself and reached the accuracy reported by the authors?

assertdebug · Aug 15 '24 15:08

No, I haven't tried re-training the model

goutamyg · Aug 16 '24 13:08