The Evil Man
Also happens on API 35.
> Here is working solution:
>
> ```
> adb shell
> pm uninstall com.google.android.art
> ```
>
> and reboot your phone!

Cool! Thanks for sharing. Gonna try it...
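In case it is useful, here is a tiny sketch of running that same workaround non-interactively from the host (assuming `adb` is on PATH and exactly one device is attached; these are just the quoted commands repackaged):

```python
# Minimal sketch: run the quoted workaround from the host instead of an
# interactive `adb shell` session, then reboot the device.
import subprocess

subprocess.run(["adb", "shell", "pm", "uninstall", "com.google.android.art"], check=True)
subprocess.run(["adb", "reboot"], check=True)
```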
> Use the regular load lora node.

Hi, thanks. I tried it, but it didn't have any effect. Probably the training didn't take effect or something.
> Hi, I got the same error. Did you fix it?

Yes, the API seems to have changed, but it still didn't work well for me.
> > > Hi, I got the same error. Did you fix it?
> > >
> > > Yes, the API seemed changed, but it still didn't work well...
Sure, where can I send it?
@wailovet I can send it via e-mail.
@wailovet Hi, mail sent, pls check!
> I haven't found a way to directly load diffusers lora in ComfyUI. However, you can use a script for conversion. https://github.com/huggingface/diffusers/blob/main/scripts/convert_diffusers_sdxl_lora_to_webui.py
>
> ```
> python convert_diffusers_sdxl_lora_to_webui.py --input_lora ./lora_sample.safetensors...
> ```
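For reference, a minimal sketch of what that conversion boils down to, assuming `diffusers.utils` exposes `convert_all_state_dict_to_peft` and `convert_state_dict_to_kohya` as in recent releases; the file names are placeholders and the linked script is the authoritative version:

```python
# Rough sketch, not the script itself: convert a diffusers-format SDXL LoRA
# into kohya/webui-style keys that ComfyUI's regular LoRA loader understands.
from safetensors.torch import load_file, save_file
from diffusers.utils import convert_all_state_dict_to_peft, convert_state_dict_to_kohya

diffusers_sd = load_file("lora_sample.safetensors")      # diffusers-format LoRA weights
peft_sd = convert_all_state_dict_to_peft(diffusers_sd)   # normalize keys to PEFT naming
kohya_sd = convert_state_dict_to_kohya(peft_sd)          # remap to kohya/webui naming
save_file(kohya_sd, "lora_sample_webui.safetensors")     # load this with the regular LoRA loader node
```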
About vLLM support: Hello there! I am trying to use FlashInfer with a self-compiled vLLM (Torch 2.4.0, CUDA 12.6). vLLM complains in `.../torch/_library/infer_schema.py`:

```
ValueError: infer_schema(func): Parameter bitorder has an unsupported...
```
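If it helps to see where that message can come from: torch 2.4's schema inference for custom ops is strict about parameter annotations and default values, and I believe a plain string default is one of the cases it rejects. Below is my own minimal guess at a repro, not FlashInfer's actual code; the op name and signature are made up:

```python
# Hypothetical repro sketch (not FlashInfer's code): under torch 2.4,
# infer_schema refuses custom-op parameters whose default value type it
# cannot encode, raising a ValueError much like the one quoted above.
# On newer torch releases this particular case is expected to pass.
import torch

try:
    @torch.library.custom_op("demo::packbits", mutates_args=())
    def packbits(x: torch.Tensor, bitorder: str = "big") -> torch.Tensor:
        return x.clone()
except ValueError as err:
    # e.g. "infer_schema(func): Parameter bitorder has an unsupported default value type ..."
    print(err)
```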