Results: 12 comments of Sheehan

I didn't notice this at first and have tried both of these. This indeed affects the update step, but fortunately it does not change the result much.

This issue is likely because you have an old version of Stable Diffusion in your path.

Hi, I haven't tried DINOv2, but I believe it works with other types of ViT as long as the architecture is similar. DINO mainly just changed the training objective....

Hi, thanks again for your interest. What seeds are you using for these results? I just checked; my results are: for FER2013, the accuracies for seeds 0, 1, 2 are 50.878, 50.655, and...

Yes, this is normal: we do 5-shot training, and the sampled shots differ for each seed. That's why we took the average over three seeds across all 20 datasets...
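To make the averaging concrete, here is a minimal sketch. The accuracy values below are purely illustrative placeholders, not the paper's numbers, and `mean_over_seeds` is a hypothetical helper, not a function from the repo:

```python
def mean_over_seeds(per_seed_acc):
    """Average test accuracy across random seeds.

    In few-shot evaluation the sampled training shots differ per seed,
    so single-seed results vary; reporting the mean smooths this out.
    """
    return sum(per_seed_acc) / len(per_seed_acc)

# illustrative per-seed accuracies for seeds 0, 1, 2 (made-up values)
accs = [50.0, 51.0, 52.0]
print(mean_over_seeds(accs))  # → 51.0
```

The same mean would then be taken dataset by dataset before averaging across all 20 datasets.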

Hi, thanks for your interest! The setting is inherited from LoRA's official development code: https://github.com/microsoft/LoRA/tree/snapshot-9-15-2021 https://github.com/microsoft/LoRA/blob/snapshot-9-15-2021/src/model.py

Hi, many thanks for your interest! The scaling factor is a hyper-parameter; you can adjust it manually, but in my experience it does not affect performance much. For the value matrix,...
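For intuition on where the scaling factor enters, here is a minimal NumPy sketch of a LoRA forward pass with toy dimensions. This is an assumption-laden illustration of the general LoRA formulation (frozen weight plus a low-rank update scaled by alpha/r), not the repo's actual module:

```python
import numpy as np

def lora_forward(x, W, A, B, alpha, r):
    """Compute W @ x plus the scaled low-rank update (alpha/r) * B @ A @ x.

    Shapes (toy example): x (d_in,), W (d_out, d_in) frozen,
    A (r, d_in) and B (d_out, r) trainable low-rank factors.
    """
    scaling = alpha / r  # the scaling hyper-parameter discussed above
    return W @ x + scaling * (B @ (A @ x))

rng = np.random.default_rng(0)
d_in, d_out, r, alpha = 8, 4, 2, 16
x = rng.standard_normal(d_in)
W = rng.standard_normal((d_out, d_in))
A = rng.standard_normal((r, d_in))
B = np.zeros((d_out, r))  # B starts at zero, so the update is inactive at init

out = lora_forward(x, W, A, B, alpha, r)
assert np.allclose(out, W @ x)  # with B = 0, only the frozen path contributes
```

Because the update is multiplied by alpha/r, changing alpha mostly rescales the effective learning rate of the adapter, which is one reason tuning it often has a limited effect.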

Hi, thanks for your interest! I was just notified that you raised the same question in the ELEVATER toolkit: https://github.com/Computer-Vision-in-the-Wild/Elevater_Toolkit_IC. So basically, the best results on the test set are reported....

Hi, many thanks for your interest. They can be found in the repo here: [configs/trainers/CoCoOp/vit_b16_c4_ep10_batch1_ctxv1.yaml](https://github.com/KaiyangZhou/CoOp/blob/main/configs/trainers/CoCoOp/vit_b16_c4_ep10_batch1_ctxv1.yaml)

Yes, but CoCoOp only supports classification. Also, specify COCOOPCF for CPL. The batch size may be adjusted in some experiments; we tested smaller batch sizes on some machines due to memory issues, this...