Zacanbot
You will see that running queries are shown in the STATUS tab with a cancel button next to them.
Koboldcpp states that its API is OpenAI-compatible. But if I configure the LocalAI or LM Studio endpoints to point at Koboldcpp, I get the same truncation behaviour as the OP. Maybe...
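For context, a minimal sketch of what such a client-side request looks like. This assumes Koboldcpp's documented default of serving an OpenAI-style API at `http://localhost:5001/v1`; the payload builder and its names are purely illustrative, not part of any of these projects. One plausible source of truncation is a client omitting `max_tokens`, letting the server fall back to a small default:

```python
import json

# Assumption: Koboldcpp's default OpenAI-compatible endpoint.
BASE_URL = "http://localhost:5001/v1"

def build_chat_request(prompt, max_tokens=512):
    """Build an OpenAI-style chat completion payload (illustrative only).

    Passing max_tokens explicitly matters: if the client omits it, the
    server may apply a small default and the reply comes back truncated.
    """
    return {
        "url": f"{BASE_URL}/chat/completions",
        "body": {
            # Single-model servers typically ignore the model name.
            "model": "koboldcpp",
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": max_tokens,
        },
    }

req = build_chat_request("Hello")
print(json.dumps(req["body"], indent=2))
```

Comparing the payload each frontend actually sends (especially `max_tokens`) against one like this is one way to narrow down where the truncation is introduced.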
Thank you for adding the Koboldcpp connection options. However, can we re-open the issue? The original truncation issue still persists in the latest version of AnythingLLM when using the new Koboldcpp...
I just updated to the latest version (1.64) and it seems to be working correctly now! Thanks for digging into this. Appreciated 👍