devika
[ISSUE] stuck at failing to decode JSON with llama3 8b
Describe your issue
The JSON parsing of the model's response is failing.
How To Reproduce
Steps to reproduce the behavior (example):
- create a chat (I am using llama3-8b from Groq)
- enter a prompt
- watch it fail
Expected behavior
The agent should parse the response and continue.
Screenshots and logs
24.05.03 09:32:56: root: INFO : SOCKET info MESSAGE: {'type': 'error', 'message': 'Failed to parse response as JSON'}
Invalid response from the model, I'm trying again...
24.05.03 09:32:56: root: INFO : SOCKET info MESSAGE: {'type': 'warning', 'message': 'Invalid response from the model, trying again...'}
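For anyone debugging this: smaller models like llama3-8b often wrap their JSON answer in markdown code fences or add prose around it, which makes a plain `json.loads` fail. Below is a minimal sketch of a tolerant parser; `extract_json` is a hypothetical helper for illustration, not devika's actual parsing code.

```python
import json
import re

def extract_json(raw: str):
    """Best-effort JSON extraction from an LLM response (illustrative only).

    Strips markdown code fences, then falls back to the outermost
    '{'..'}' span if the cleaned text still is not valid JSON.
    """
    # Remove ```json ... ``` fences the model may wrap around its answer
    cleaned = re.sub(r"```(?:json)?", "", raw).strip()
    try:
        return json.loads(cleaned)
    except json.JSONDecodeError:
        # Fall back to the first '{' .. last '}' span in the text
        start, end = cleaned.find("{"), cleaned.rfind("}")
        if start != -1 and end > start:
            return json.loads(cleaned[start:end + 1])
        raise
```

This will not fix a response that is genuinely malformed JSON, but it handles the common "valid JSON surrounded by fences or chatter" case.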
Configuration
- OS: Linux
- Python version: 3.10.12
- Node version: not installed (I only have Bun)
- bun version: 1.1.6
- Search engine: custom (Tavily, configured under the Bing profile)
- Model: llama3-8b-8192
Additional context
It worked before the last PR, #522.
What's your prompt? I tested with llama3-8b via Groq and Ollama before pushing.
I am not home right now, but I asked llama3 to create the best Discord bot for a project I am working on.
Same happens with llama2.
Is it solved?
I use llama3 on ollama
and my config is