Fails to parse Ollama model code output
Describe the bug
Whenever I instruct Devika to program anything, the initial stages of the process go well, but once it reaches the programming step, the agent fails to parse the model's output correctly: `Invalid response from the model, trying again...`
All other parts of the agent are functional, including web browsing, searching, and search-result parsing.
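For context, agents like Devika typically expect the model's reply to contain fenced code blocks, and local models often emit slightly malformed fences that a strict parser rejects with exactly this kind of "invalid response" error. The following is a hypothetical sketch (not Devika's actual code) of a more lenient extractor; the function name and fallback behavior are assumptions for illustration:

```python
import re

# Matches a fenced code block with an optional language tag, e.g.
# ```python\n...\n```  — DOTALL lets the body span multiple lines.
FENCE_RE = re.compile(r"```(?:[\w.+-]*)\n(.*?)```", re.DOTALL)

def extract_code_blocks(response: str) -> list[str]:
    """Return the contents of all fenced code blocks in an LLM reply."""
    blocks = [m.strip() for m in FENCE_RE.findall(response)]
    if blocks:
        return blocks
    # Fallback: if the model omitted the fences entirely, treat the
    # whole reply as one block instead of failing the parse outright.
    return [response.strip()] if response.strip() else []

reply = "Here is the file:\n```python\nprint('hello')\n```\nDone."
print(extract_code_blocks(reply))  # ["print('hello')"]
```

A parser along these lines degrades gracefully on unfenced output, which is the failure mode local models hit most often.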
Desktop (please complete the following information):
- OS: WSL (Ubuntu 22.04) - Windows 11 Pro
- Browser: Firefox
- Version:
commit edd99a7fd112173436f3b8e3bcdadb1e31a961eb (HEAD -> main, origin/main, origin/HEAD)
Author: ayush rajgor <[email protected]>
Date: Fri Apr 5 00:24:21 2024 +0530
Fix: setting page loading issue, messages from user (#359)
* fix: HMR error
* fix: second message from not showing/ remove:console.log
resolved #359
Additional context
I've attached the console log file (a txt of the console output) here. I can provide any additional logs.
#347 and #307 are related, and this is likely a duplicate, though it remains an important issue for local models.
What LLM model did you use?
It is mentioned in the log file.
Duplicate of #347.