LearningToCountAnything
Understanding the ckpts
@victorprad @mahobley 1) Could you please help me understand why there are two checkpoints, localisation.ckpt and counting.ckpt? 2) If we want to train on a custom dataset, do we have to use both checkpoints?
-
During inference, which checkpoint should be used: localisation or counting?
-
What is the difference between `self.CFG["resume_counting_head_path"]` and the `resume` path?
-
The two checkpoints were provided because our main result was trained without a localisation loss, so we wanted to offer a checkpoint trained with minimal supervision. The localisation checkpoint is only provided to allow reproducibility of the figures in our work.
-
If you are training on a custom dataset, I would suggest not providing a checkpoint in the resume path. I would also suggest setting `counting_backbone_unfreeze_layers: -1` in the config.
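For custom-dataset training, the two suggestions above might look like the following config fragment (the `resume_path` key name is an assumption for illustration; only `counting_backbone_unfreeze_layers` appears in the discussion above):

```yaml
# Illustrative fragment only -- key names other than
# counting_backbone_unfreeze_layers are assumptions.
resume_path: null                      # do not resume from a provided checkpoint
counting_backbone_unfreeze_layers: -1  # unfreeze the entire backbone for training
```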
-
The counting checkpoint is used during inference. The localisation checkpoint is only for visualisation, to better understand the learnt features.
-
`resume_counting_head_path` is used when the backbone is frozen; it allows only the head to be saved, rather than the whole network, to minimise storage space.
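The head-only save/resume idea can be sketched as below. This is a minimal illustration, not the repository's actual code: the `counting_head.` key prefix and both helper names are assumptions, and plain lists stand in for tensors so the sketch stays self-contained.

```python
# Sketch of saving/restoring only the counting head when the backbone
# is frozen. A state_dict is just a mapping from parameter names to
# tensors; here plain lists stand in for tensors.

def save_head_only(state_dict, prefix="counting_head."):
    """Keep only the head's parameters, to minimise checkpoint size."""
    return {k: v for k, v in state_dict.items() if k.startswith(prefix)}

def resume_counting_head(full_state_dict, head_ckpt):
    """Overlay saved head weights onto a model's full state_dict."""
    merged = dict(full_state_dict)
    merged.update(head_ckpt)
    return merged

full = {
    "backbone.layer1.weight": [0.1, 0.2],  # frozen, no need to re-save
    "counting_head.fc.weight": [0.5],      # trained, must be saved
}
head_ckpt = save_head_only(full)           # only the head keys survive
```

Since the frozen backbone weights never change, dropping them from the checkpoint loses nothing: resuming just re-applies the saved head on top of the original backbone weights.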