> > Maybe it will be possible in the future, or can only people with 3090 and 4090 GPUs train LoRAs for FLUX? > > Take a look at [fluxgym](https://github.com/cocktailpeanut/fluxgym/) 
> so I reverted to an earlier commit. Using fp16 LoRA doesn't help anything. Can you tell me which commit? Thanks.
So, what is the last commit that works as it should?
> Every picture I create without a LoRA is perfectly crisp using models: NF4, FP8, FP16, Q8, and DEV > > As soon as I add a LoRA they all become...
Again... 
> I have the same problem since last week. I have tried several different workflows, but it keeps disconnecting all the time. Ok, so we need to find...
> Maybe do exactly what is written there; try this version: https://github.com/lllyasviel/stable-diffusion-webui-forge/releases/download/latest/webui_forge_cu121_torch231.7z But which folder needs to be replaced, exactly?
https://github.com/lllyasviel/stable-diffusion-webui-forge/issues/1566
NVIDIA 3090.... 
> Hi. The thing is that I can't update; this is for the ArgosTranslate developers (https://github.com/argosopentech/argos-translate). Alternatively, you can look here — there was already a similar question: [#139](https://github.com/AlekPet/ComfyUI_Custom_Nodes_AlekPet/issues/139)...