Felor
@neecapp Yes, I've been testing back and forth, but I think I figured it out in my last commit. The forward must be first, so I added an override to always...
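For context on the forward-override idea, here is a minimal sketch of that pattern, assuming a plain `nn.Linear` target and illustrative `lora_down`/`lora_up` names (not the actual commit): the module's original forward runs first, and the low-rank update is added on top of its output.

```python
import torch.nn as nn

def apply_lora_override(module: nn.Linear, rank: int = 4, scale: float = 1.0):
    """Monkeypatch module.forward so the original forward runs first and the
    LoRA delta (scale * up(down(x))) is added to its output."""
    device, dtype = module.weight.device, module.weight.dtype
    lora_down = nn.Linear(module.in_features, rank, bias=False).to(device, dtype)
    lora_up = nn.Linear(rank, module.out_features, bias=False).to(device, dtype)
    nn.init.zeros_(lora_up.weight)  # no-op until real LoRA weights are loaded

    orig_forward = module.forward

    def lora_forward(x):
        return orig_forward(x) + scale * lora_up(lora_down(x))

    module.forward = lora_forward
    return lora_down, lora_up  # keep references so their weights can be loaded later
```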
Shifted things around. @damian0815 I started looking into adding the LoRA cross attention processor, but adjusting the keys and getting it to properly import the state dictionary was proving to...
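To illustrate the key-adjustment part of that: LoRA checkpoints in the common flattened layout use underscore-joined keys such as `lora_unet_down_blocks_0_attentions_0_...`, while the UNet's modules are addressed with dot-separated names. A rough sketch of one way to bridge that, covering only the underscore-to-dot step (the exact suffix format the cross attention processor expects is an assumption left out here):

```python
def remap_lora_keys(lora_state_dict, unet):
    """Map flattened 'lora_unet_<name_with_underscores>' keys onto the UNet's
    real dot-separated module names by comparing underscore-collapsed forms."""
    # e.g. 'down_blocks_0_attentions_0_...' -> 'down_blocks.0.attentions.0....'
    flat_to_dotted = {name.replace(".", "_"): name for name, _ in unet.named_modules()}
    remapped = {}
    for key, tensor in lora_state_dict.items():
        flat_name, _, suffix = key.removeprefix("lora_unet_").partition(".")
        dotted = flat_to_dotted.get(flat_name)
        if dotted is None:
            continue  # key does not correspond to a module in this UNet
        remapped[f"{dotted}.{suffix}"] = tensor  # suffix is e.g. 'lora_down.weight'
    return remapped
```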
> Rewrote the hook version and fixed some issues, got that working very well on my end.
>
> I have a version that loads cross attention and works fine,...
@neecapp @damian0815 Added a peft setup; it does not work yet, but with a LoRA trained with peft (https://github.com/huggingface/peft) it will try to use it. The issue is something related...
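For anyone following along, the rough shape of "try to use it" with a peft-trained LoRA is to wrap the base model with the saved adapter. A sketch of the idea only (the paths, the base checkpoint, and wrapping the UNet specifically are all assumptions, and as noted above the setup is not working yet):

```python
from diffusers import StableDiffusionPipeline
from peft import PeftModel

# Load the base pipeline, then wrap the UNet with the peft adapter so the
# LoRA weights saved by peft are applied during inference.
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
pipe.unet = PeftModel.from_pretrained(pipe.unet, "path/to/peft-lora")  # hypothetical path

image = pipe("a photo of a cat").images[0]
```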
Just wanted to post an update. With the talk of a code freeze for the nodes implementation, I have paused on doing much here. There also appears to be another...