johndev168
**LocalAI version:** 3.6.0, latest commit #6409
**Environment, CPU architecture, OS, and Version:** Mac T8132, macOS 15.5 (24F74), no VM
**Describe the bug** The bug causes the model to...
**LocalAI version:** 3.6.0, latest commit #6438
**Environment, CPU architecture, OS, and Version:** Mac T8132, macOS 15.5 (24F74), no VM
**Describe the bug** When trying to load certain models...
**LocalAI version:** 3.5.0
**Environment, CPU architecture, OS, and Version:** Mac T8132, macOS 15.5 (24F74), no VM
**Describe the bug** Loading a model with the MLX backend does not work. It refuses...
**LocalAI version:** 3.5.0
**Environment, CPU architecture, OS, and Version:** Mac T8132, macOS 15.5 (24F74), no VM
**Describe the bug** When editing the model via the UI, you can specify a system...
**LocalAI version:** 3.5.0
**Environment, CPU architecture, OS, and Version:** Mac T8132, macOS 15.5 (24F74), no VM
**Describe the bug** When asking LocalAI to answer in formatted HTML, it does not...
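For reference, this is the kind of request involved: a minimal sketch of asking for HTML output through LocalAI's OpenAI-compatible chat endpoint. The host/port and model name are assumptions (a default local install), not details from the report.

```python
import requests

# Minimal sketch: LocalAI exposes an OpenAI-compatible chat completions
# endpoint. localhost:8080 and the model name are assumptions.
resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "my-model",  # hypothetical model name configured in LocalAI
        "messages": [
            {"role": "system", "content": "Answer in formatted HTML only."},
            {"role": "user", "content": "Summarize the benefits of unit tests."},
        ],
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```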
**Describe the bug** When ingesting files with Ollama, I get spammed with an error saying that Ollama cannot decode batches. After some research, the underlying issue seems...
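Batch-decode errors like this often come from single inputs exceeding the backend's batch capacity, and a common workaround is to split documents into smaller pieces before embedding them. A minimal sketch of that approach, assuming Ollama's `/api/embeddings` endpoint on its default port; the model name and chunk size are arbitrary illustrations, not a verified fix for this particular bug.

```python
import requests

def embed_chunks(text: str, model: str = "nomic-embed-text", chunk_chars: int = 2000):
    """Split a document into smaller chunks and embed each one separately,
    so no single request exceeds the backend's batch capacity."""
    chunks = [text[i:i + chunk_chars] for i in range(0, len(text), chunk_chars)]
    embeddings = []
    for chunk in chunks:
        resp = requests.post(
            "http://localhost:11434/api/embeddings",  # Ollama's default port
            json={"model": model, "prompt": chunk},
            timeout=120,
        )
        resp.raise_for_status()
        embeddings.append(resp.json()["embedding"])
    return embeddings
```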