Add add_weighted_adapter to IA3 adapters
(Partially) Resolves: https://github.com/huggingface/peft/issues/1688 See https://github.com/huggingface/peft/pull/980 for context.
What
$(IA)^3$ adapters can't be combined. This option, however, is available for other PEFT adapters such as LoRA.
Solution
Implement the `add_weighted_adapter` method for $(IA)^3$ models, supporting only a weighted average of adapters for now.
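For illustration, here is a minimal sketch of how combining two $(IA)^3$ adapters could look with this method, mirroring the existing LoRA `add_weighted_adapter` convention (model and adapter paths/names are placeholders, not part of this PR):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load a base model and two (IA)^3 adapters (paths are placeholders)
base = AutoModelForCausalLM.from_pretrained("base-model")
model = PeftModel.from_pretrained(base, "path/to/ia3-adapter-1", adapter_name="adapter_1")
model.load_adapter("path/to/ia3-adapter-2", adapter_name="adapter_2")

# Combine the two adapters with a weighted average (weights sum to 1)
model.add_weighted_adapter(
    adapters=["adapter_1", "adapter_2"],
    weights=[0.5, 0.5],
    adapter_name="combined",
)
model.set_adapter("combined")
```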
@alexrs Thanks a lot for reviving the feature and updating it so quickly. LMK when you think it's ready for review.
@Abdullah-kwl If you could give this a try, that would be great.
@BenjaminBossan Ready for review! 👀
Hey @BenjaminBossan. I made some changes:
- Refactored the tests to include `_test_weighted_combination_of_adapters_lora` and `_test_weighted_combination_of_adapters_ia3`. These methods are called from `_test_weighted_combination_of_adapters`.
- Added a short section to the docs. Let me know if I should expand it.
With respect to whether we should normalize the adapter weights or not, I'm not sure what the best approach is. On the one hand, I agree that combining adapters whose weights sum to more than 1 might not give the best results. On the other hand, I also think it is useful for users to specify how they want their adapter weights normalized. In your example, `weight=1/10` is a perfectly valid approach, but so is taking a softmax of the weights, or setting some weights to 0 to do a sort of top-k combination. I'm not sure this method should be in charge of handling all of that.
Maybe we should just state in the docs that we recommend `sum(weights) == 1`? We could also print a warning if that condition is not fulfilled.
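Something along these lines is what I had in mind for the warning (a rough sketch only, the helper name and tolerance are placeholders, not the final implementation):

```python
import warnings

def _warn_if_weights_not_normalized(weights):
    # Warn if the adapter weights do not sum to 1; the combination is still
    # performed, but the result may be harder to interpret.
    total = sum(weights)
    if abs(total - 1.0) > 1e-6:
        warnings.warn(
            f"The adapter weights sum to {total:.4f}, not 1. Consider normalizing them, "
            "e.g. weights = [w / total for w in weights]."
        )
```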