lora-inference
No module named 't2i_adapters'
I just want to launch Kohya-ss LoRA inference on a clean GPU server.
Is there any way I can do this?
I was trying to reverse-engineer your code to get just the inference path with LoRA, but there seems to be no end to filling in the missing pieces.
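For what it's worth, here is a minimal sketch of doing this with the diffusers pipeline instead of this repo's scripts (not the author's method). The base-model name, the LoRA filename, and the prompt are placeholders; diffusers' `load_lora_weights` can read Kohya-style `.safetensors` LoRA files directly in recent versions.

```python
import torch
from diffusers import StableDiffusionPipeline

# Placeholder base model: use whatever checkpoint the LoRA was trained against.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# Load a Kohya-ss LoRA file (placeholder directory and filename).
pipe.load_lora_weights(".", weight_name="my_lora.safetensors")

image = pipe(
    "a photo of sks person",   # placeholder prompt
    num_inference_steps=30,
    guidance_scale=7.0,
).images[0]
image.save("out.png")
```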
I was able to get zero errors and apply the LoRA.
But the results are way worse than using the same LoRA in the WebUI.
How can I achieve the same quality in Python?
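In my experience the gap usually comes from mismatched generation settings rather than the LoRA itself: sampler, LoRA strength, steps, CFG, and clip skip. Below is a sketch of how one might line those up with diffusers, continuing the placeholder `pipe` from the sketch above; the scheduler mapping and the 0.8 weight are assumptions, not a guaranteed fix.

```python
from diffusers import DPMSolverMultistepScheduler

# WebUI's "DPM++ 2M Karras" roughly corresponds to this scheduler setup.
pipe.scheduler = DPMSolverMultistepScheduler.from_config(
    pipe.scheduler.config, use_karras_sigmas=True
)

prompt = "a photo of sks person"          # placeholder
negative_prompt = "lowres, blurry"        # placeholder

image = pipe(
    prompt,
    negative_prompt=negative_prompt,
    num_inference_steps=30,
    guidance_scale=7.0,
    cross_attention_kwargs={"scale": 0.8},  # LoRA strength, like <lora:name:0.8> in WebUI
).images[0]
```

If the LoRA was trained with clip skip 2 (common for anime-style models), newer diffusers versions also accept a `clip_skip` argument in the pipeline call, which is worth matching as well.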
I was able to pull in that missing dependency with:
pip install git+https://github.com/cloneofsimo/t2i-adapter-diffusers
HTH