[Feature Request] Unable to unload or swap adapters at runtime
Currently, adapters can be loaded with:
model.load_weights('/path/to/weights', strict=False)
However, there is no way to either unload the weights:
model.unload_weights()
or to swap in a new one dynamically:
model.swap_weights('/path/to/other/weights')
This would be useful for several use cases, including DPO loss calculation and dynamically serving LoRA adapters.
You can swap adapters like so:
model.load_weights("adapters1.safetensors", strict=False)
# Swap to new adapters
model.load_weights("adapters2.safetensors", strict=False)
For unloading weights, it really depends on what you want to do. If you just want to delete the entire model, simply delete it:
del model
Though for that, maybe you could say more about the use case.
Sweet. For unloading, what I had in mind was making model.load_weights(..) followed by model.unload_weights(..) an idempotent operation: loading and then unloading should leave the model exactly as it started. That way I can run LoRA on a model for a bit (producing a new adapter), then, during loss calculations, run evaluations comparing results from the original model against the adaptation without keeping a redundant copy of the original, and continue until the LoRA training is complete.
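As a rough sketch of how that round trip could be emulated today: before a strict=False load, snapshot just the parameters the adapter file will overwrite, then restore that snapshot to "unload". Plain dicts stand in for the model's parameter tree here so the sketch runs standalone, and snapshot/load/unload are hypothetical helper names, not part of the real API.

```python
# Hypothetical helpers: "unload" by saving the entries an adapter load is
# about to overwrite, then restoring them afterward. Plain dicts stand in
# for the model's parameter tree.

def snapshot(params, keys):
    """Copy the current values of the parameters a load will overwrite."""
    return {k: params[k] for k in keys if k in params}

def load(params, new_weights):
    """Stand-in for model.load_weights(..., strict=False): merge weights in."""
    params.update(new_weights)

def unload(params, saved):
    """Restore the snapshot, so load followed by unload is a no-op."""
    params.update(saved)

params = {"layer.weight": 1.0, "layer.bias": 0.0}
adapter = {"layer.weight": 2.0}

saved = snapshot(params, adapter.keys())
load(params, adapter)
assert params["layer.weight"] == 2.0                        # adapter active
unload(params, saved)
assert params == {"layer.weight": 1.0, "layer.bias": 0.0}   # original restored
```

The key design point is that only the overwritten keys are copied, so the memory cost is one extra copy of the adapter-sized subset, not of the whole base model.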