Some questions.

WentaoTan opened this issue 3 years ago

This is very interesting work and it has inspired me a lot; thank you for sharing this paper! After reading it, I have a couple of questions. Could you help clarify them? Thank you very much!

(1) Table 2c gives an ablation on the scale factor s, but the paper does not explain why s is used. The residual forms I have seen elsewhere are all direct additions, which is equivalent to s = 1, so why introduce s here?

(2) Why is the performance of full-tuning inferior to AdaptFormer in the video experiments (Table 1)? In other words, what properties of AdaptFormer give it such good performance?

Thank you!

WentaoTan avatar Jun 08 '22 09:06 WentaoTan

Hi, thanks for your interest and questions.

(1) The motivation behind the scaling factor s is to balance the original frozen feature and the newly learned feature when they are combined with an element-wise sum. This design is inspired by LoRA and the Scaled Parallel Adapter.
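For concreteness, here is a minimal sketch of the scaled parallel design (not the repository's exact code; the dimensions and the value of s are illustrative):

```python
import torch
import torch.nn as nn

class ScaledParallelAdapter(nn.Module):
    """Bottleneck adapter whose output is scaled by s before the residual add.
    Illustrative sizes: d_model=768, bottleneck=64."""
    def __init__(self, d_model=768, bottleneck=64, s=0.1):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)  # trainable down-projection
        self.act = nn.ReLU()
        self.up = nn.Linear(bottleneck, d_model)    # trainable up-projection
        self.s = s                                  # the scale factor ablated in Table 2c

    def forward(self, x):
        return self.s * self.up(self.act(self.down(x)))

# Inside a frozen transformer block, the adapter runs in parallel with the MLP:
#   out = x + mlp(norm(x)) + adapter(norm(x))
# With s = 1 this reduces to the plain additive residual you mention;
# a smaller s down-weights the new branch relative to the frozen feature.
```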

(2) For a strictly fair comparison, we use the same training recipe for all fine-tuning methods in our experiments. The recipe mainly follows the linear-probe settings described in the Appendix, which do not include strong regularizations (e.g., mixup or cutmix). However, it has been shown that training vision transformers usually requires such regularization. The inferior full-tuning results therefore likely stem from an inappropriate training recipe: the linear-probe recipe is suitable for tuning a limited number of parameters (linear probe and AdaptFormer) but not for tuning all parameters.
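As a rough sketch, the kind of regularization mentioned above could be added with timm's `Mixup` helper (the hyperparameter values here are illustrative, not the recipe used in the paper):

```python
import torch
from timm.data import Mixup
from timm.loss import SoftTargetCrossEntropy

# Enable both mixup and cutmix; one of the two is chosen per batch.
mixup_fn = Mixup(
    mixup_alpha=0.8, cutmix_alpha=1.0,
    prob=1.0, switch_prob=0.5,
    label_smoothing=0.1, num_classes=1000,
)
criterion = SoftTargetCrossEntropy()  # Mixup produces soft targets

# Dummy batch to show the call pattern.
images = torch.randn(8, 3, 224, 224)
targets = torch.randint(0, 1000, (8,))
images, targets = mixup_fn(images, targets)  # mixed inputs, soft labels
```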

We will conduct experimental comparisons between full tuning and AdaptFormer with the above regularizations in the near future. Thanks again for the question.

ShoufaChen avatar Jun 08 '22 12:06 ShoufaChen

We're closing this issue since there are no further questions.

Please feel free to reopen it if needed.

ShoufaChen avatar Sep 23 '22 01:09 ShoufaChen