The training code

Open liuxingbin opened this issue 11 months ago • 8 comments

Hi, Thanks for the solid work. Could you let me know when you'll release the training code?

liuxingbin avatar Mar 08 '24 07:03 liuxingbin

Hi~ Thanks for your attention! The training code and models will be released this week!

wjf5203 avatar Mar 18 '24 06:03 wjf5203

Thanks for your reply. I am looking forward to the training code. Best,

liuxingbin avatar Mar 18 '24 08:03 liuxingbin

Hi~ Thank you for your patience. The training code and the image-level inference script have been released. I will continue to update the inference scripts for the other results in the paper.

wjf5203 avatar Mar 19 '24 02:03 wjf5203

Hi, thanks for the update. I am wondering if there are any tips for fine-tuning on my own dataset?

liuxingbin avatar Mar 19 '24 03:03 liuxingbin

I'm glad to hear that you are trying to fine-tune GLEE on your own dataset. The newly added data should first be converted into the standard COCO Detection or RefCOCO format. Then you can refer to the builtin.py file, which registers numerous datasets on top of Detectron2 and can serve as a reference. For the new data, you also need to define its task name and update several places in the code: the corresponding category names, the number of categories (here, here and here), and the newly added denoising embedding. It's best to find an existing dataset similar to your own, check its routing path, and then set up a config based on it. I recommend loading the -joint.pth weights for fine-tuning.

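The COCO Detection conversion step described above can be sketched in plain Python. This is an illustrative helper, not part of the GLEE codebase: the function name `to_coco_detection`, the record layout, and the `"widget"` category are all made up for the example; only the output dict follows the standard COCO Detection schema (`images`, `annotations` with `[x, y, w, h]` boxes, and 1-based `categories` ids).

```python
import json

def to_coco_detection(samples, categories):
    """Convert (file_name, width, height, boxes) records into a standard
    COCO Detection dict. Each box is (category_index, (x, y, w, h)) with a
    0-based category index; COCO category ids are 1-based."""
    coco = {
        "images": [],
        "annotations": [],
        "categories": [
            {"id": i + 1, "name": name} for i, name in enumerate(categories)
        ],
    }
    ann_id = 1
    for img_id, (file_name, width, height, boxes) in enumerate(samples, start=1):
        coco["images"].append(
            {"id": img_id, "file_name": file_name, "width": width, "height": height}
        )
        for cat_idx, (x, y, w, h) in boxes:
            coco["annotations"].append(
                {
                    "id": ann_id,
                    "image_id": img_id,
                    "category_id": cat_idx + 1,  # shift to 1-based ids
                    "bbox": [x, y, w, h],
                    "area": w * h,
                    "iscrowd": 0,
                }
            )
            ann_id += 1
    return coco

# Example: one image containing a single box of the first category.
coco = to_coco_detection(
    [("0001.jpg", 640, 480, [(0, (10, 20, 100, 50))])],
    categories=["widget"],
)
print(json.dumps(coco["categories"]))  # [{"id": 1, "name": "widget"}]
```

The resulting dict can be dumped with `json.dump` to an annotation file and then registered with Detectron2 the same way the datasets in builtin.py are.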
wjf5203 avatar Mar 21 '24 03:03 wjf5203

@wjf5203 Thanks for the solid work. When do you plan to release the fine-tuning scripts?

muengsuaengsuai avatar Apr 22 '24 03:04 muengsuaengsuai

Is there a benchmark for training efficiency? Thanks.

Hiutin avatar Apr 23 '24 05:04 Hiutin