Co-DETR
Freeze most of the layers and train only the head
I would like to train on a custom dataset, but I want to freeze most of the layers and only train the last few (such as the head or the decoder). I hope the mAP/AP can reach the results of training a YOLO model on the same dataset. Is this feasible?
I have not explored this configuration yet. Perhaps you can run this experiment on your custom dataset and benchmark this setting.
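For anyone who wants to try this, below is a minimal PyTorch sketch of freezing everything except the head/decoder by parameter name. The keyword substrings (`bbox_head`, `query_head`, `decoder`) are assumptions; inspect `model.named_parameters()` in your own Co-DETR config to find the actual module names before relying on them.

```python
import torch


def freeze_all_but_head(model: torch.nn.Module,
                        trainable_keywords=("bbox_head", "query_head", "decoder")):
    """Set requires_grad=False for every parameter whose name does not
    contain one of the trainable keywords (keyword names are assumed,
    not taken from the Co-DETR codebase)."""
    n_trainable = 0
    for name, param in model.named_parameters():
        param.requires_grad = any(k in name for k in trainable_keywords)
        n_trainable += int(param.requires_grad)
    print(f"trainable parameter tensors: {n_trainable}")
    return model


# Build the optimizer only over the parameters that remain trainable,
# so the frozen weights receive no updates.
# optimizer = torch.optim.AdamW(
#     (p for p in model.parameters() if p.requires_grad), lr=2e-5)
```

If you are using an MMDetection-based config, you can additionally freeze the early backbone stages at build time via the backbone's `frozen_stages` option (available for backbones such as ResNet), which complements the name-based freezing above.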