Information about the epochs in the code vs. in the paper
Hello,
Thanks for your work. I was reading the D-FINE paper, and it mentions that D-FINE-L was trained for 72 epochs, but in the code I found this: https://github.com/Peterande/D-FINE/blob/4a1f73a8bcfac736a88abde9596d87f116d780a7/configs/dfine/dfine_hgnetv2_l_coco.yml#L34-L44
What does the `n` mean in `72 + 2n`?
I also noticed that D-FINE-S uses a different configuration: https://github.com/Peterande/D-FINE/blob/4a1f73a8bcfac736a88abde9596d87f116d780a7/configs/dfine/dfine_hgnetv2_s_coco.yml#L51-L61
Why do you use `120 + 4n` here instead of `120 + 2n`?
Finally, in RT-DETR they use a `base_size_repeat` of 3 for all models, but you change it depending on the model size, and for D-FINE-S you use a value of 20. How did you arrive at these values?