LearningToCountAnything

Understanding the ckpts

Open jaideep11061982 opened this issue 2 years ago • 1 comments

@victorprad @mahobley

  1. Could you please help me understand why there are two checkpoints, localisation.ckpt and counting.ckpt?

  2. If we train on a custom dataset, do we have to use both checkpoints?

  3. Which checkpoint should be used during inference, localisation or counting?

  4. What is the difference between self.CFG["resume_counting_head_path"] and the self.resume path?

jaideep11061982 avatar Dec 14 '22 17:12 jaideep11061982

  1. The two checkpoints were provided because our main result was trained with no localisation loss, so we wanted to supply a checkpoint trained with minimal supervision. The localisation checkpoint is provided only to allow reproducing the figures in our work.

  2. If you are training on a custom dataset, I would suggest not providing a checkpoint in the resume path. I would also suggest setting counting_backbone_unfreeze_layers: -1 in the config.

  3. The counting checkpoint is used during inference. The localisation checkpoint is only for visualisation, to better understand the learnt features.

  4. resume_counting_head_path is used when the backbone is frozen: it allows only the head to be saved, rather than the whole network, to minimise storage space.
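The head-only checkpoint described in point 4 can be sketched as follows. This is a minimal illustration, not the repo's actual code: the class and attribute names (CountingModel, backbone, counting_head) are hypothetical stand-ins, and the real backbone would be a large pretrained feature extractor rather than a single linear layer.

```python
# Hedged sketch: saving only a counting head when the backbone is frozen.
# Names here (CountingModel, backbone, counting_head) are illustrative,
# not the repository's actual module layout.
import torch
import torch.nn as nn

class CountingModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(16, 8)      # stand-in for a large frozen backbone
        self.counting_head = nn.Linear(8, 1)  # small trainable counting head

    def forward(self, x):
        return self.counting_head(self.backbone(x))

model = CountingModel()
for p in model.backbone.parameters():
    p.requires_grad = False  # frozen backbone: no need to checkpoint it

# Save only the head's parameters (the idea behind resume_counting_head_path):
# the checkpoint stays tiny compared to the full network.
torch.save(model.counting_head.state_dict(), "counting_head.ckpt")

# Later: rebuild the model (the frozen backbone weights come from elsewhere)
# and restore just the head from the small checkpoint.
restored = CountingModel()
restored.counting_head.load_state_dict(torch.load("counting_head.ckpt"))
```

Because the frozen backbone never changes during training, reloading it from its original source and restoring only the head reproduces the full trained network.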

mahobley avatar Dec 16 '22 11:12 mahobley
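For point 2 above, the suggested config changes for a custom dataset might look roughly like the fragment below. The exact key name for the resume path is not confirmed in this thread, so treat it as an assumption and check the repo's config files.

```yaml
# Hedged sketch of the suggested settings for training on a custom dataset.
# Key names other than counting_backbone_unfreeze_layers are assumed.
resume_path: null                      # do not resume from a provided checkpoint
counting_backbone_unfreeze_layers: -1  # unfreeze the whole backbone
```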