Tom-Neverwinter
solved: set flash size to 4MB (FS: 2MB, OTA: ~1019KB)
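A minimal sketch of where that flash-size setting lives, assuming an ESP32 project built with PlatformIO (the environment name and partition scheme here are illustrative, not the exact values from the comment above):

```ini
; platformio.ini — select 4MB flash with a partition table that includes OTA app slots
[env:esp32dev]
platform = espressif32
board = esp32dev
board_upload.flash_size = 4MB
board_build.partitions = default.csv  ; a stock 4MB scheme with OTA slots + filesystem
```

In the Arduino IDE the equivalent is the Tools > Flash Size and Partition Scheme menus.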
yeah, it was fun: if you had a lot of memory you could set it to something like 10,000 new tokens and it would write really long paragraphs. now it's laughable...
https://www.youtube.com/watch?v=ksBWKa_30Hc
had this error, then closed and re-opened, and it took the file and processed it with no problem? "Error transcribing file on line CUDA failed with error out of memory" https://pastebin.com/wRVCpcep...
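That a restart cleared it suggests a transient CUDA out-of-memory failure. As a sketch only (the `transcribe` and `clear_cache` callables below are hypothetical stand-ins, not the app's real API), a tool could retry after freeing cached memory instead of failing outright:

```python
def run_with_retry(transcribe, path, clear_cache=lambda: None, retries=1):
    """Call transcribe(path); on an out-of-memory RuntimeError, run
    clear_cache (e.g. torch.cuda.empty_cache() when torch is in use)
    and try again, up to `retries` extra attempts."""
    for attempt in range(retries + 1):
        try:
            return transcribe(path)
        except RuntimeError as err:
            # re-raise anything that is not an OOM, or the final failed attempt
            if "out of memory" not in str(err) or attempt == retries:
                raise
            clear_cache()  # free cached allocations before retrying
```

This only papers over the symptom; if the model genuinely does not fit in VRAM, the retry will fail the same way.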
so it's not just me. https://github.com/kwaroran/RisuAI/issues/582 — seems like a lot of the APIs are broken
accepted the terms of service for both of the stated models and added a read token, but it still gives an error
is this with or without a LoRA? with a LoRA this is an often-posted issue: https://github.com/lllyasviel/stable-diffusion-webui-forge/issues/1177
flux1Dev_v10.safetensors is definitely going to run out of VRAM and crash. as for `Full flux-dev NF4` and `flux1-dev-Q4_0.gguf`: there is currently a memory leak, however it should load with...
you might have to re-install and move your items over. can you provide a log? more information might reveal an answer
please supply more information. [trying to assist in cleaning up the mass of issues atm. didn't know it had an SVD option]