
PEFT implementations of adapters are outdated and languishing

Open bghira opened this issue 1 year ago • 7 comments

System Info

Latest PEFT, transformers, diffusers codebase.

Who can help?

@benjaminbossan @sayakpaul

Information

  • [X] The official example scripts
  • [ ] My own modified scripts

Tasks

  • [X] An officially supported task in the examples folder
  • [ ] My own task or dataset (give details below)

Reproduction

The LoKr, LoHa, and other LyCORIS modules are all outdated and missing substantial improvements and fixes from their upstream implementations.

Expected behavior

Either the upstream implementation should be directly wrapped, or these implementations should be kept up to date by the PEFT team. It doesn't seem like it should be on KohakuBlueleaf to update support for his adapters since the code was hastily copy-pasted into PEFT before being abandoned.

bghira avatar Jul 16 '24 16:07 bghira

We're currently a bit dev-starved when it comes to PEFT (basically just me), which will hopefully be resolved soon. Could you give an example of what is broken or missing in LoKr and LoHa? Wrapping probably won't be an option, as the implementations differ too much and we have different requirements in PEFT.

BenjaminBossan avatar Jul 16 '24 16:07 BenjaminBossan

full matrix tuning, 1x1 convolutions, quantised LoHa/LoKr, weight-decomposed LoHa/LoKr, fixed rank dropout implementation, fixed maths (not multiplying against the vector, but only the scalar)
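For context, a minimal sketch of the factorisation these adapters are built on: LoKr stores the weight update as a Kronecker product of two small matrices, so a large delta is represented compactly. All names and shapes below are illustrative, not PEFT's or LyCORIS's actual API, and the scalar `scale` stands in for the alpha/rank-style scaling mentioned above.

```python
import numpy as np

# Illustrative LoKr-style delta weight: the (out, in) update is the
# Kronecker product of two much smaller factors, times a scaling term.
# (Hypothetical names/shapes; not the real PEFT implementation.)
rng = np.random.default_rng(0)

out_dim, in_dim = 8, 12           # full weight update shape: (8, 12)
w1 = rng.standard_normal((2, 3))  # small factor: (a, b)
w2 = rng.standard_normal((4, 4))  # small factor: (out/a, in/b)

scale = 0.5                       # alpha/rank-style scaling (a scalar here;
                                  # the report above concerns this term)
delta_w = scale * np.kron(w1, w2)

assert delta_w.shape == (out_dim, in_dim)
```

The storage saving is the point: here 6 + 16 parameters stand in for a 96-element update, and the same factorisation extends to convolution weights.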

bghira avatar Jul 16 '24 16:07 bghira

Basically, the results are entirely different between the upstream and PEFT implementations, with the upstream author recommending that PEFT be avoided at this point in time :[

bghira avatar Jul 16 '24 16:07 bghira

Thanks for the info. I reformatted this and put what sounds like bugs to the top, as they are more important. If you have any further references, like GH issues, that would be fantastic.

  • [ ] fixed rank dropout implementation
  • [ ] fixed maths (not multiplying against the vector, but only the scalar)
  • [ ] full matrix tuning
  • [ ] 1x1 convolutions
  • [ ] quantised LoHa/LoKr
  • [ ] weight-decomposed LoHa/LoKr
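For reference, the "weight-decomposed" item refers to a DoRA-style decomposition applied to LoHa/LoKr deltas: the weight is split into a magnitude vector and a direction, the adapter delta perturbs the direction, and the learned magnitude rescales each row. A rough sketch under assumed shapes (illustrative only, not the requested implementation):

```python
import numpy as np

# DoRA-style weight decomposition, sketched generically: the adapter delta
# (from any factorisation, e.g. LoHa/LoKr) adjusts the direction of each
# weight row, while a learned magnitude vector m controls the row norms.
rng = np.random.default_rng(0)

W = rng.standard_normal((8, 12))                # base weight
delta_w = 0.01 * rng.standard_normal((8, 12))   # adapter delta (stand-in)

# magnitude is initialised to the base row norms, then trained separately
m = np.linalg.norm(W, axis=1, keepdims=True)

direction = W + delta_w
W_prime = m * direction / np.linalg.norm(direction, axis=1, keepdims=True)

# each row of the merged weight keeps the magnitude stored in m
assert np.allclose(np.linalg.norm(W_prime, axis=1, keepdims=True), m)
```

The design rationale (per the DoRA paper) is that decoupling magnitude from direction lets low-rank updates behave more like full fine-tuning; the checklist item above asks for this on top of LoHa/LoKr rather than plain LoRA.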

BenjaminBossan avatar Jul 16 '24 16:07 BenjaminBossan

Thanks for making us aware of this, @bghira!

@BenjaminBossan WDYT about also opening this to the community in case someone is interested in collaborating to start with some of these?

sayakpaul avatar Jul 17 '24 02:07 sayakpaul

Yes, absolutely @sayakpaul. Do you have a suggestion for how best to promote this?

BenjaminBossan avatar Jul 17 '24 08:07 BenjaminBossan

@BenjaminBossan https://github.com/huggingface/peft/issues/1935

sayakpaul avatar Jul 18 '24 04:07 sayakpaul

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

github-actions[bot] avatar Aug 16 '24 15:08 github-actions[bot]

no stalebot. bad

bghira avatar Aug 16 '24 15:08 bghira

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

github-actions[bot] avatar Sep 11 '24 15:09 github-actions[bot]