GLEE
The training code
Hi, Thanks for the solid work. Could you let me know when you'll release the training code?
Hi~ Thanks for your attention! The training code and models will be released this week!
Thanks for your reply. I am looking forward to the training code. Best,
On 2024-03-18 14:34:45, "Junfeng Wu" @.***> wrote:
Hi~Thank you for your patience. The training code and image-level inference script have been released. I will continue to update the inference script for other results in the paper.
Hi, thanks for the update. I am wondering if there are any tips for fine-tuning on my own dataset?
I'm glad to hear that you are trying to fine-tune GLEE on your own dataset. The newly added data should first be converted into the standard COCO Detection or RefCOCO format. Then you can refer to the builtin.py file, which registers numerous datasets on top of Detectron2 and can serve as a reference. Additionally, for the newly incorporated data, you need to define its task name and update several places in the code with the corresponding category names and the number of categories (here, here and here), as well as the newly added denoising embedding. It's best to find an existing dataset similar to your own, check its routing path, and then set up a config. I recommend loading the -joint.pth weights for fine-tuning.
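To illustrate the first step (converting custom data to the standard COCO Detection format), here is a minimal sketch. The input record layout (`file_name`/`width`/`height`/`boxes`) and the function name are hypothetical, not part of GLEE; only the output keys (`images`, `annotations`, `categories`, `bbox` as `[x, y, w, h]` in absolute pixels) follow the COCO Detection convention:

```python
import json

def to_coco_format(samples, categories):
    """Convert a list of simple records (a hypothetical intermediate format:
    {file_name, width, height, boxes}) into a COCO Detection dict.
    Each entry in `boxes` is ([x, y, w, h] in pixels, category_name)."""
    coco = {
        "images": [],
        "annotations": [],
        "categories": [{"id": i, "name": n} for i, n in enumerate(categories)],
    }
    cat_id = {n: i for i, n in enumerate(categories)}
    ann_id = 0
    for img_id, s in enumerate(samples):
        coco["images"].append({
            "id": img_id,
            "file_name": s["file_name"],
            "width": s["width"],
            "height": s["height"],
        })
        for (x, y, w, h), cat in s["boxes"]:
            coco["annotations"].append({
                "id": ann_id,
                "image_id": img_id,
                "category_id": cat_id[cat],
                "bbox": [x, y, w, h],
                "area": w * h,
                "iscrowd": 0,
            })
            ann_id += 1
    return coco

# Hypothetical example record and category list
samples = [{"file_name": "img0.jpg", "width": 640, "height": 480,
            "boxes": [([10, 20, 100, 50], "cat")]}]
coco = to_coco_format(samples, ["cat", "dog"])
with open("my_dataset_train.json", "w") as f:
    json.dump(coco, f)
```

Once the annotations are in this format, the dataset can be registered the same way the existing entries in builtin.py are, and then referenced from your config.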
@wjf5203 Thanks for the solid work. When do you plan to release the fine-tuning scripts?
Is there a benchmark for training efficiency? Thanks.