
TAPIR Checkpoint for Training/Finetuning

chandlj opened this issue · 4 comments

In the previous iteration of TAP-Net, the released checkpoint file contained not only the model state but also the optimizer state and the global_step. This was helpful because you could load it directly into an experiment and easily start finetuning. However, I don't believe there is a similar checkpoint file for TAPIR. The README has a checkpoint for the "online" version of the model, and one of the linked notebooks has a checkpoint for the offline model, but neither includes training state.

Could you release a checkpoint with the training state included for the TAPIR model?
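
For context, here is a rough sketch of what a bundled checkpoint like the old TAP-Net one enables. The file name and key names ("params", "state", "opt_state", "global_step") are assumptions for illustration, not the actual layout of the released TAPIR files:

```python
# Hypothetical sketch: restoring a TAP-Net-style checkpoint that bundles
# model params, optimizer state, and the global step.
import numpy as np

ckpt = np.load("tapnet_checkpoint.npy", allow_pickle=True).item()

params = ckpt["params"]            # model weights
state = ckpt.get("state", {})      # e.g. batch-norm statistics, if present
opt_state = ckpt["opt_state"]      # optimizer moments etc. (missing from TAPIR releases)
global_step = ckpt["global_step"]  # where the LR schedule left off

# With all of these pieces you can resume training exactly where it stopped;
# with only `params` you have to rebuild the optimizer state yourself.
```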

chandlj avatar Jun 28 '23 03:06 chandlj

I wasn't even aware that the optimizer state was included in the TAPNet file. Even so, I'm not sure it's that useful when training on a new dataset, as we would have decayed the learning rate down to 0, and I would expect gradient statistics to be different on a new dataset. Is it possible to just re-initialize the optimizer state and use learning rate warmup? Do you have evidence that TAP-Net finetuning is harder without the optimizer state?
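
A minimal sketch of that suggestion, keeping only the pretrained params and rebuilding a fresh optimizer state with learning-rate warmup. The use of optax and the schedule values below are illustrative assumptions, not the repo's actual training configuration:

```python
# Re-initialize the optimizer from scratch with warmup instead of reusing
# the original optimizer state from the checkpoint.
import optax

schedule = optax.warmup_cosine_decay_schedule(
    init_value=0.0,
    peak_value=1e-4,     # finetuning learning rate, assumed
    warmup_steps=1000,
    decay_steps=50_000,
    end_value=0.0,
)
optimizer = optax.adamw(learning_rate=schedule, weight_decay=1e-2)

# `params` are the pretrained TAPIR weights loaded from the public checkpoint;
# the optimizer state is initialized fresh for the new dataset.
opt_state = optimizer.init(params)
```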

It would be a non-trivial amount of work to add this to our checkpoints, but I agree that finetuning is a good use-case. We may consider releasing this if there's strong evidence that it helps finetuning.

cdoersch avatar Jun 30 '23 16:06 cdoersch

@chandlj - were you able to finetune on your data?

vrk7 avatar Jul 30 '23 16:07 vrk7

@vrk7 We tried fine-tuning on some self-generated DAVIS labels but did not see a performance improvement in some very limited testing. We haven't completely ruled it out as beneficial, but we decided not to invest much more time into it for the time being.

chandlj avatar Aug 02 '23 16:08 chandlj

@chandlj If possible, could you please share the Colab notebook you used for it?

vrk7 avatar Aug 02 '23 16:08 vrk7