Huy Truong
**LocalAI version:** 1.30.0 (latest). **Environment, CPU architecture, OS, and Version:** Windows Server 2022, Xeon E5-2670 v2, GPU GeForce GTX 1070. **Describe the bug** LocalAI is using the CPU instead of the GPU. CUDA...
**Describe the bug** I'm using Flowise and generated the HTML embed. I tried to insert the code into an HTML page. When I try to chat, the response is an error. Log...
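When the embedded widget only shows an error, it can help to call the Flowise prediction endpoint directly and inspect the raw response before blaming the embed snippet. A minimal sketch, assuming a Flowise instance on localhost:3000 and a placeholder chatflow ID (replace with the ID from your embed code):

```python
import requests

# Assumed local Flowise instance; the chatflow ID below is a placeholder.
FLOWISE_URL = "http://localhost:3000/api/v1/prediction/<your-chatflow-id>"

resp = requests.post(FLOWISE_URL, json={"question": "Hello, are you working?"}, timeout=120)
print(resp.status_code)   # a non-200 status points at the flow or LocalAI, not the HTML embed
print(resp.text)          # the raw body usually carries the underlying error message
```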
**Is your feature request related to a problem? Please describe.** It could be a problem. I use Docker Desktop, and when rebuilding LocalAI, LocalAI uses only one CPU at 100% while...
**LocalAI version:** **Environment, CPU architecture, OS, and Version:** I'm using Docker Desktop for LocalAI on Ubuntu WSL. **Describe the bug** I'm using the model LocalAI-llama3-8b-function-call-v0.2 with Conversational Retrieval QA in Flowise, with...
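To separate Flowise-side problems from model-side ones, the function-call model can be queried straight through LocalAI's OpenAI-compatible chat endpoint. A minimal sketch, assuming LocalAI is reachable on localhost:8080 and the model is loaded under the name used in the issue:

```python
import requests

# Assumed LocalAI endpoint; adjust host/port to your Docker Desktop mapping.
BASE_URL = "http://localhost:8080/v1"

payload = {
    # Assumed to match the name the model is registered under in LocalAI.
    "model": "LocalAI-llama3-8b-function-call-v0.2",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello, can you answer a question about my documents?"},
    ],
    "temperature": 0.2,
}

# LocalAI exposes an OpenAI-compatible /v1/chat/completions endpoint.
resp = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If this direct call works, the problem is more likely in the Flowise chain configuration than in the model itself.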
### Describe the bug I'm using FAISS with LocalAI embeddings via the 🦙 llama.cpp Python API. With the old version of Flowise, FAISS similarity search was working well. However, after upgrading to...
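To check whether the regression sits in Flowise or in the embedding/FAISS layer, the same pipeline can be reproduced standalone. A minimal sketch, assuming LocalAI serves embeddings on localhost:8080 and an embedding model registered under an assumed alias:

```python
import faiss
import numpy as np
import requests

BASE_URL = "http://localhost:8080/v1"        # assumed LocalAI endpoint
EMBED_MODEL = "text-embedding-ada-002"       # assumed alias for the embedding backend

def embed(texts):
    """Fetch embeddings from LocalAI's OpenAI-compatible /embeddings endpoint."""
    resp = requests.post(
        f"{BASE_URL}/embeddings",
        json={"model": EMBED_MODEL, "input": texts},
        timeout=120,
    )
    resp.raise_for_status()
    return np.array([d["embedding"] for d in resp.json()["data"]], dtype="float32")

docs = ["LocalAI runs models locally.", "Flowise builds LLM flows visually."]
doc_vectors = embed(docs)

# Flat L2 index; dimensionality is taken from the returned vectors.
index = faiss.IndexFlatL2(doc_vectors.shape[1])
index.add(doc_vectors)

query_vector = embed(["How do I run models locally?"])
distances, ids = index.search(query_vector, 2)
print([(docs[i], float(d)) for i, d in zip(ids[0], distances[0])])
```

If this standalone similarity search behaves correctly, the issue is more likely in how the newer Flowise version wires FAISS to the LocalAI embeddings node.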