
[docs] Perform Finetuning on custom dataset (freeze-partial-parameters-during-training)

CerPhd opened this issue 1 year ago • 0 comments

📚 The doc issue

Hello everyone. There is no error in the documentation, but I do not understand how to fine-tune a pose estimator in MMPose. Specifically, I want to fine-tune only the last layers of 'rtmpose-x_8xb256-700e_body8-halpe26-384x288' on a rather small dataset whose annotations differ from COCO, Halpe, etc. What I need to do is:

  • Create a suitable custom dataset, done.
  • Change the number of output channels of the head (RTMCCHead), done.
  • Train only the last layers of the selected RTMPose model. Here I do not understand where and how to apply the 'freeze parameters during training' mechanism, since I do not want to retrain the whole head. The first thing I cannot figure out is how to get a definitive list of all the blocks that make up the head. The second question is what names I should use for these blocks when I configure the optim_wrapper for the head. Example:

        optim_wrapper = dict(
            optimizer=dict(...),
            paramwise_cfg=dict(
                custom_keys={
                    'head.alayerofRTMCChead': dict(lr_mult=0, decay_mult=0),
                    'head.gau': dict(lr_mult=0, decay_mult=0),
                }))

    Thank you for your support!
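A minimal sketch of the inspection step, assuming PyTorch is installed. The toy `head` below is hypothetical and only mimics the kind of named-submodule layout a head like RTMCCHead has; running the same `named_parameters()` loop on the real model gives the authoritative block names to use after the `head.` prefix in `custom_keys`:

```python
import torch.nn as nn

# Hypothetical stand-in for a head with a few named blocks; in practice
# you would inspect model.head from the built MMPose model instead.
head = nn.Sequential()
head.add_module('final_layer', nn.Conv2d(32, 26, 1))
head.add_module('mlp', nn.Linear(26, 26))
head.add_module('gau', nn.Linear(26, 26))

# 1) Definitive list of blocks: collect the first component of every
#    parameter name (e.g. 'final_layer.weight' -> 'final_layer').
block_names = sorted({name.split('.')[0] for name, _ in head.named_parameters()})
print(block_names)  # -> ['final_layer', 'gau', 'mlp']

# 2) Freeze everything except the block(s) you want to train,
#    here keeping only 'gau' trainable.
for name, param in head.named_parameters():
    param.requires_grad = name.startswith('gau')
```

Setting `requires_grad = False` this way is an alternative to `lr_mult=0` in `paramwise_cfg` when the frozen blocks should be excluded from the optimizer entirely; with `custom_keys`, the same first-level names would appear as `'head.final_layer'`, `'head.mlp'`, and so on.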

Suggest a potential alternative/fix

No response

CerPhd · Jan 26 '24 17:01