zimenglan

Results 111 comments of zimenglan

> > Hi, @zhangchbin, we have no plan to do this now. But we have updated the arxiv paper to release more details about this large model.
> >
> > May I ask, for ViT-L...

Are there any updates on the test set?

hi @Adamdad, does `full_data_size` represent the `labeled` dataset?

for the `consistent_teacher_r50_fpn_coco_720k_fulldata.py` config, if using 8 GPUs, `epoch_length * num_gpu * batch_per_gpu` equals `7330 * 8 * 5`, which is 293,200. But the COCO2017 labeled and unlabeled...

hi @schwarzwalder93, you can see https://github.com/microsoft/SoftTeacher/issues/29. It seems that `epoch_length * batchsize_of_labeled_data >= full_labeled_data_len`.
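The arithmetic in these two comments can be sanity-checked with a short sketch. The COCO2017 image counts below are the commonly cited figures, not values taken from this thread, so treat them as assumptions:

```python
# Check that one "epoch" of the iteration-based runner covers the labeled set.
# Dataset sizes are assumptions (commonly cited COCO2017 counts).
COCO_LABELED = 118287     # train2017 images
COCO_UNLABELED = 123403   # unlabeled2017 images

def images_seen_per_epoch(epoch_length, num_gpus, batch_per_gpu):
    """Total samples drawn in one epoch across all GPUs."""
    return epoch_length * num_gpus * batch_per_gpu

# Numbers from the fulldata config discussed above: 7330 iters, 8 GPUs, 5 per GPU.
seen = images_seen_per_epoch(7330, 8, 5)
print(seen)                  # 293200
print(seen >= COCO_LABELED)  # True: the labeled set is fully covered
```

This matches the 293,200 figure computed in the earlier comment and the `epoch_length * batchsize >= full_labeled_data_len` condition from the SoftTeacher issue.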

hi @Adamdad, I trained the `consistent_teacher_r50_fpn_coco_180k_10p.py` config using 4 GPUs; at iteration 16000, the results are as below:
```
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.113...
```

hi @Adamdad, should I modify `data.samples_per_gpu` and `data.sampler.train.sample_ratio` to increase the labeled samples for 4-GPU training, and reduce the learning rate? e.g.
```
data.samples_per_gpu=6
data.sampler.train.sample_ratio=[2, 4]
lr = 0.01...
```
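A common heuristic for this kind of adjustment is the linear scaling rule: scale the learning rate in proportion to the effective (total) batch size. The baseline values below (8 GPUs x 5 samples/GPU at `lr=0.01`) are assumptions based on the config numbers mentioned in this thread, not confirmed by the maintainer:

```python
# Hypothetical helper illustrating the linear scaling rule for the learning rate.
# Baseline setup is assumed: 8 GPUs x 5 samples/GPU, base lr = 0.01.
def scaled_lr(base_lr, base_total_batch, new_total_batch):
    """Scale lr proportionally to the effective batch size per step."""
    return base_lr * new_total_batch / base_total_batch

base_batch = 8 * 5   # 40 images per step on 8 GPUs
new_batch = 4 * 6    # 24 images per step with samples_per_gpu=6 on 4 GPUs

print(scaled_lr(0.01, base_batch, new_batch))  # roughly 0.006
```

This is only a rule of thumb; semi-supervised pipelines with separate labeled/unlabeled sample ratios may warrant a different adjustment.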

hi @Adamdad, ![image](https://github.com/Adamdad/ConsistentTeacher/assets/13795888/1cc57884-d8e3-4b02-80aa-c2c8959a2e7f) it seems this config file cannot be found.

> We only have a file called `configs/consistent-teacher/consistent_teacher_r50_fpn_coco_720k_fulldata.py`. No config is provided for 360k training on full data.

Thanks. Another question: where can I find the `configs/consistent-teacher/base.py` file?

hi @Adamdad, two questions here: 1. From the log file, the learning rate stays the same throughout the training phase. Why? 2. What is the difference between the 360k and...