diffusers
[LoRA] use the PyTorch classes wherever needed and start deprecation cycles
What does this PR do?
Since we have shifted to the `peft` backend for all things LoRA, there's no need for us to use the `LoRACompatible*` classes anymore.
We should also start the deprecation cycles for `LoRALinearLayer` and `LoRAConv2dLayer`. This PR does that.
@yiyixuxu up for another review. Rigorous review appreciated!
cc @BenjaminBossan, we would appreciate it if you could give a review as well.
@yiyixuxu addressed all your comments. They were very very helpful! Thank you!
Thanks, Benjamin!
As a user, I might not know what to do with that info: is this feature removed completely, or can I still use it but have to do it differently? Also, I might get the impression that I can still pass `scale` and it works, it's just deprecated, when in fact the argument doesn't do anything, right? Perhaps the message could be clarified.
Very good point. I clarified that as much as I could.
Moreover, if we already have an idea in which diffusers version this will be removed (hence raise an error), it could be added to the warning. On top, we could add a comment like # TODO remove argument in diffusers X.Y to make it more likely that this will indeed be cleaned up when this version is released.
The `deprecate()` utility I am using will automatically take care of that. So, once we hit 1.0.0, on any PR, it will raise an error unless handled accordingly. Here is an example: https://github.com/huggingface/diffusers/pull/6885. Does that work?
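For readers unfamiliar with the utility, here is a minimal sketch of the assumed call pattern; the class name and `forward()` body are illustrative, not the exact code from this PR:

```python
# Minimal sketch of how diffusers' `deprecate()` utility gates a dead
# keyword argument; illustrative, not the exact code from this PR.
import torch
from diffusers.utils import deprecate


class AttnProcessorSketch(torch.nn.Module):  # hypothetical class for illustration
    def forward(self, hidden_states, scale=None):
        if scale is not None:
            # Emits a FutureWarning for now. Once the installed diffusers
            # version reaches 1.0.0, the same call raises an error instead,
            # which forces the dead argument to finally be removed.
            deprecate("scale", "1.0.0", "The `scale` argument is deprecated and will be ignored.")
        # `scale` is intentionally ignored from here on.
        return hidden_states
```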
Regarding the error message:
```python
deprecation_message = f"Use of `scale` is deprecated. Please remove the argument. Even if you pass it to the `forward()` of the {attn.__class__.__name__} class, it won't have any effect."
```
I think it's almost too detailed; users will not normally pass the argument directly to the `{attn.__class__.__name__}` class, right? Instead, the argument was probably passed along by something higher up. I think the message could be shortened to:
```python
deprecation_message = f"The `scale` is deprecated and will be ignored. Please remove it, as passing it will raise an error in the future."
```
Can we also add a sentence on how to control the scale instead?
> The `deprecate()` utility I am using will automatically take care of that. So, once we hit 1.0.0, on any PR, it will raise an error unless handled accordingly. Here is an example: #6885. Does that work?
Cool, I didn't know :+1:
How about the following?
> The `scale` is deprecated and will be ignored. Please remove it, as passing it will raise an error in the future. `scale` should directly be passed while calling the underlying pipeline component, i.e., via `cross_attention_kwargs`.
Yes, that sounds good, as it clarifies to the user what they need to do.
@yiyixuxu I have resolved your comments, except for https://github.com/huggingface/diffusers/pull/7204/#discussion_r1520259595. Thank you!
Can anyone tell me how to set a scale for LoRA if I can't use `scale` anymore?
@panxiaoguang `scale` corresponds to `lora_alpha / r`; you simply need to make sure to pass your desired `lora_alpha` and `r` arguments in `LoraConfig`.
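For illustration, a minimal sketch with `peft`; the `target_modules` names below are assumptions for a typical diffusers attention block, not something this PR prescribes:

```python
from peft import LoraConfig

# Effective LoRA scale = lora_alpha / r, so lora_alpha=16 with r=8
# reproduces a scale of 2.0.
config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["to_q", "to_k", "to_v", "to_out.0"],  # assumed module names
)
```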
> Can anyone tell me how to set a scale for LoRA if I can't use `scale` anymore?
Apart from what @younesbelkada mentioned (which applies to training only), you can definitely use `cross_attention_kwargs={"scale": ...}` at inference time and it will all work. The https://github.com/huggingface/diffusers/pull/7338 PR will probably clear up all the confusion users may have had.
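For illustration, a minimal inference-time sketch; the model id and LoRA path are placeholders, not part of this PR:

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16  # placeholder model id
).to("cuda")
pipe.load_lora_weights("path/to/lora")  # placeholder LoRA checkpoint

# `scale` is applied to the loaded LoRA weights at call time.
image = pipe(
    "a photo of an astronaut riding a horse",
    cross_attention_kwargs={"scale": 0.7},
).images[0]
```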