unsloth
[Question] Adding several peft adapters and ensuring unsloth takes them into account
I would like to add several PEFT adapters to my LLM. Since unsloth does not override PEFT's `add_adapter` function, I currently use the following:
```python
from peft import LoraConfig
from unsloth import FastLanguageModel

config = LoraConfig(
    r=self._r,
    lora_alpha=self._alpha,
    target_modules=target_modules,
    lora_dropout=0,
    bias="none",
)

# get_peft_model does not accept a task_type kwarg, so drop it
unsloth_peft_config = config.to_dict()
del unsloth_peft_config["task_type"]

peft_model = FastLanguageModel.get_peft_model(
    llm_module,
    **unsloth_peft_config,
    use_gradient_checkpointing="unsloth" if not self._use_cache else False,
)

peft_model.add_adapter("new_adapter_1", config)
peft_model.add_adapter("new_adapter_2", config)
peft_model.add_adapter("new_adapter_3", config)
```
Is this the correct approach, or is there a better way to do it? Are the new adapters actually taken into account by unsloth (e.g. when training or switching between them)?