Daniel LaLiberte
I wonder if there is just a misunderstanding going on here. Given my limited knowledge of git, I may be missing something, but a few things are clear to me...
I'm seeing these errors too, since I am running the webtest before adding a new test for the semantic checks. I think I see how we can clean up most...
Here is what it looks like so far: 
> Recording of the autogen-ui. Main steps The OP's issue is about how to run autogen-ui with a local LLM service, not how to run autogen-ui locally. I don't see...
I am seeing this problem, using the latest version of AnythingLLM (0.2.0?). I saw it when using LM Studio, but then it seemed to clear up on its own, or...
> Is it because there is an issue with streaming or because certain models do not support it? With Kobold, I was seeing the whole stream of tokens being generated,...
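For context on the streaming behavior being discussed: OpenAI-compatible local servers (the kind LM Studio and similar tools expose) stream completions as server-sent events, where each `data:` line carries a `chat.completion.chunk` JSON object and the stream ends with `data: [DONE]`. A minimal sketch of collecting the token deltas from such a stream follows; the function name is illustrative, and this assumes the standard OpenAI chunk shape rather than any backend-specific variation:

```python
import json

def collect_stream_tokens(lines):
    """Join content deltas from OpenAI-style SSE lines into one string.

    Each data line looks like:
        data: {"choices": [{"delta": {"content": "Hi"}}]}
    and the stream terminates with:
        data: [DONE]
    """
    tokens = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines and comments
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            tokens.append(delta["content"])
    return "".join(tokens)
```

A client that only reads the response after the connection closes would see the whole stream of tokens arrive at once, which matches the symptom described above; a client parsing line by line as data arrives shows tokens incrementally.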
> consider using PowerShell version 7 or later, as it might have better support for handling different archive formats. I just downloaded PSVersion 7.4, the latest. (Then you have to...
Got it working, thanks. The next issue, as I recall (it was many hours ago, lol), was that the OpenAI API support is not complete enough for my needs.
Thanks for your extensive response. It affirms that you do care about the UI and ease of use. Btw, I think the AnythingLLM UI is mostly great, but this particular...
Here is what I experienced regarding the default settings, which I had posted to the discord. I installed (cloned and built) the AnythingLLM repo for development. I am on a...