Tom-Neverwinter
I think Flux is picky about the VAE it uses, the main one being ae.safetensors.
This is an issue I also have when comparing it to ComfyUI. Maybe we can have a more technical response on this when you have time.
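If it helps narrow down the VAE mismatch: the safetensors format starts with an 8-byte little-endian header length followed by a JSON header, so you can list which tensor names ae.safetensors and the other VAE actually contain using just the stdlib, without loading any weights. A minimal sketch (`safetensors_keys` is a hypothetical helper name, not part of any library):

```python
import json
import struct

def safetensors_keys(path):
    """Return the tensor names stored in a .safetensors file by reading
    only its JSON header (8-byte little-endian length, then JSON)."""
    with open(path, "rb") as f:
        header_len = struct.unpack("<Q", f.read(8))[0]
        header = json.loads(f.read(header_len))
    # "__metadata__" is an optional non-tensor entry in the header
    return sorted(k for k in header if k != "__metadata__")
```

Comparing the key lists of the two VAE files (e.g. `decoder.*` vs `first_stage_model.*` prefixes) would show whether the loader is simply rejecting an unexpected key layout.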
Adding https://github.com/lllyasviel/stable-diffusion-webui-forge/issues/1186
seems to resolve the issue. If it has to resume it will fight you, but you can complete it. Likewise, if it hits an error you can run the recheck...
I often find this model repeats regardless of the settings. I assume it hasn't been fixed, especially as the model is dated July, a week before additional tokenizer fixes were implemented.
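For what it's worth, the usual sampler-side mitigation for this is a repetition penalty applied to the logits before sampling. A minimal sketch of the CTRL-style rule most backends use (assuming plain Python lists of logits; `apply_repetition_penalty` is a hypothetical helper name):

```python
def apply_repetition_penalty(logits, generated_ids, penalty=1.2):
    """Discourage tokens that already appeared in the output.
    With penalty > 1.0, positive logits shrink toward zero and
    negative logits become more negative."""
    out = list(logits)
    for tid in set(generated_ids):
        if out[tid] > 0:
            out[tid] /= penalty
        else:
            out[tid] *= penalty
    return out
```

If the model repeats even with a penalty like this enabled, the problem is more likely the tokenizer/template bug than the sampler settings.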
Doesn't this just mean I need to import my zero key? That should be applicable for NAOMI1, NAOMI2, Chihiro, and Triforce: https://www.arcade-projects.com/threads/how-to-create-a-zero-key-pic-for-net-booting-and-cf-on-naomi-chihiro-and-triforce.6611/ I must have over a hundred compatible Arduino chips...
Known issue; please use search before submitting.
Tool calling looks like it's getting added as well: https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct/discussions/53
Reddit also tipped off another change: https://www.reddit.com/r/LocalLLaMA/comments/1eg5wgb/llama_31_changed_its_chat_template_again/ So there's still a lot going on behind the scenes: https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct/tree/main
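For anyone wondering what actually changes when the chat template changes: the authoritative template is the Jinja string in the repo's tokenizer_config.json, but a hand-rolled sketch of the Llama 3.1 instruct prompt shape (simplified; `render_llama31` is a hypothetical helper, and this omits the tool-calling and date headers the real template adds) looks roughly like:

```python
def render_llama31(messages):
    """Simplified rendering of the Llama 3.1 instruct format:
    each message is wrapped in header tokens and terminated with
    <|eot_id|>, then a trailing assistant header prompts the reply."""
    out = "<|begin_of_text|>"
    for m in messages:
        out += (f"<|start_header_id|>{m['role']}<|end_header_id|>"
                f"\n\n{m['content']}<|eot_id|>")
    out += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return out
```

Template updates upstream tweak exactly this kind of detail (header contents, tool-call sections), which is why a UI with a stale copy can silently produce worse outputs.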
Still getting a weird bug; it may not actually be related?
```
21:03:33-121169 INFO     Starting Text generation web UI
21:03:33-124170 INFO     Loading settings from "settings.yaml"
21:03:33-128170 INFO     Loading the extension "Lucid_Vision"...
```