
[Feat]: LoRA - Advanced Layer Filter

bananasss00 opened this issue 5 months ago · 1 comment

Describe your use-case.

Flux has layers named single_transformer_blocks.* and transformer_blocks.*.

If I want to train only the transformer_blocks.* layers and exclude single_transformer_blocks.*, the current filtering method won't suffice: the filter is a plain substring check, and "transformer_blocks" occurs in every single_transformer_blocks.* name, so both groups end up matching, as the sketch below shows.
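For illustration, here is a minimal standalone sketch of the problem (the layer names are just examples, not the full Flux module list):

    module_filter = ["transformer_blocks"]

    layer_names = [
        "transformer_blocks.0.attn.to_q",         # should be trained
        "single_transformer_blocks.0.attn.to_q",  # should be excluded
    ]

    # The current check is a substring test, and "transformer_blocks" is a
    # substring of every "single_transformer_blocks.*" name, so both match.
    matched = [name for name in layer_names if any(x in name for x in module_filter)]
    print(matched)  # both names are selected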

What would you like to see as a solution?

Perhaps we could add a feature where, if a filter entry starts with the character '^', the layer name is matched against the rest of the entry as a prefix (startswith) instead of the current substring check. Here is a possible implementation:

    def __create_modules(self, orig_module: nn.Module | None) -> dict[str, PeftBase]:
        lora_modules = {}

        if orig_module is not None:
            for name, child_module in orig_module.named_modules():
                # Empty filter -> train everything. An entry starting with '^'
                # requires a prefix match on the layer name; all other entries
                # keep the existing substring behavior.
                if len(self.module_filter) == 0 or any(
                    name.startswith(x[1:]) if x.startswith('^') else x in name
                    for x in self.module_filter
                ):
                    if isinstance(child_module, (Linear, Conv2d)):
                        lora_modules[name] = self.klass(self.prefix + "_" + name, child_module, *self.additional_args, **self.additional_kwargs)

        return lora_modules
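With this rule, a filter entry such as "^transformer_blocks" would select only the double blocks. A small standalone sketch of the intended matching behavior (the matches function here is just for illustration, not part of OneTrainer):

    def matches(name: str, module_filter: list[str]) -> bool:
        # Empty filter means "no restriction"; '^' entries are prefix matches,
        # everything else keeps the current substring behavior.
        if len(module_filter) == 0:
            return True
        return any(
            name.startswith(x[1:]) if x.startswith('^') else x in name
            for x in module_filter
        )

    print(matches("transformer_blocks.0.attn.to_q", ["^transformer_blocks"]))         # True
    print(matches("single_transformer_blocks.0.attn.to_q", ["^transformer_blocks"]))  # False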

Have you considered alternatives? List them here.

No response

bananasss00 · Sep 09 '24 17:09