Peter Krenesky
Artifacts now refresh as of #44. Tasks still need streaming support.
`MissingCommandMarkers` indicates `ParseJSON` couldn't find JSON in the response. This most often happens when the bot doesn't know whom to delegate to or how to respond. If you give...
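To illustrate the failure mode, here is a minimal sketch of marker-based extraction. The names and regex are hypothetical, not the actual `ParseJSON` implementation:

```python
import json
import re

def parse_json(response: str) -> dict:
    """Extract a fenced JSON command block from an LLM reply (sketch)."""
    match = re.search(r"```json\s*(.*?)```", response, re.DOTALL)
    if match is None:
        # The model replied in plain prose instead of emitting a command
        # block, so there are no markers to find.
        raise ValueError("MissingCommandMarkers: no JSON found in response")
    return json.loads(match.group(1))
```

A prose-only reply (e.g. the bot asking for clarification instead of delegating) contains no fenced block, so extraction fails with exactly this kind of marker error.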
This should be resolved in v0.3:
- `@code` and `@ix` agents were both reworked to use OpenAI functions. This eliminates most JSON parse failures.
- `@ix` now responds to general inquiries...
A basic test shows the model can understand and respond in Korean. I don't speak Korean, so I used Google Translate.

> DEBUG Moderator returned response=안녕하세요! 어떻게 도와드릴까요?

(Translation: "Hello! How can I help you?") The...
After further investigation, the issue is not related to the language used. I consistently replicated this error with English characters.

**Steps to replicate:**
1. Send a message
2. Send a second...
I think the fix you made is the right one. The chat input always sends `user_input`. The expected values for the chain can be edited, but the chat doesn't map...
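A minimal sketch of the remapping involved (the function and key names are illustrative, not the actual chat code): the chat UI always posts `{"user_input": ...}`, so if the chain's expected input keys were edited, the value needs to be remapped rather than erroring.

```python
def map_chat_input(chain_input_keys, user_text):
    """Map the chat UI's fixed `user_input` payload onto the chain's
    expected input variable (hypothetical sketch)."""
    if "user_input" in chain_input_keys:
        return {"user_input": user_text}
    if len(chain_input_keys) == 1:
        # A single expected key presumably receives the chat text.
        return {chain_input_keys[0]: user_text}
    raise KeyError(f"cannot map user_input onto {chain_input_keys!r}")
```

For example, a chain edited to expect `question` would still receive the chat text under that key.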
Here are related docs for adding memory to the `OpenAIFunctionsAgent`: https://python.langchain.com/docs/modules/agents/how_to/add_memory_openai_functions The example in the docs uses a `MessagesPlaceholder` to add the memory variable to the agent's template. Need to...
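A sketch of the pattern from those docs, assuming the pre-1.0 `langchain` API and an OpenAI key in the environment (`tools` here is just an empty placeholder, not our actual tool list):

```python
from langchain.agents import AgentExecutor, OpenAIFunctionsAgent
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import MessagesPlaceholder
from langchain.schema import SystemMessage

tools = []  # placeholder; a real agent would pass its tool list
memory = ConversationBufferMemory(memory_key="memory", return_messages=True)

# MessagesPlaceholder reserves a slot in the prompt template that the
# memory fills with prior messages on each call.
prompt = OpenAIFunctionsAgent.create_prompt(
    system_message=SystemMessage(content="You are a helpful assistant."),
    extra_prompt_messages=[MessagesPlaceholder(variable_name="memory")],
)

agent = OpenAIFunctionsAgent(
    llm=ChatOpenAI(temperature=0), tools=tools, prompt=prompt
)
executor = AgentExecutor(agent=agent, tools=tools, memory=memory)
```

The key detail is that `memory_key` must match the placeholder's `variable_name` so the buffered messages land in the right template slot.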
Can you expand on what feature you'd like?
### Local LLMs

Two options for local models are available:
- Ollama
- llama-cpp

### OpenAI Proxy

OpenAI proxies also work by setting the proxy URL and/or API base....
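For the proxy/API-base case, a minimal configuration sketch assuming the pre-1.0 `openai` Python client and a local OpenAI-compatible server (the URL is illustrative, not a default from this project):

```python
import openai

# Point the client at a local OpenAI-compatible endpoint instead of
# api.openai.com; most local servers ignore the key but require one to be set.
openai.api_base = "http://localhost:8000/v1"  # illustrative URL
openai.api_key = "not-needed-locally"
```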
> ah, all already there.
>
> so this server-tab is in the settings, I assume? settings it never opens for me.

It's set individually for each LLM node.

1....