Md Zuhair
What exactly did you change?
> The code prompt also has problems with LLAMA and some other models. Any way to improve that?

What issue exactly did you face?
I'm guessing there might be some API key in Bedrock too for Claude? That doesn't work, right?
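For context, here's roughly how Claude is reached through Amazon Bedrock (a minimal sketch, not devika's actual code): Bedrock authenticates with your AWS credentials rather than an Anthropic-style API key, and the region and model ID below are just example values.

```python
# Minimal sketch (not devika's actual code): calling Claude via Amazon Bedrock.
# Bedrock uses AWS credentials (access key/secret or an IAM role), not an
# Anthropic API key. Region and model ID here are example values.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "messages": [{"role": "user", "content": "Hello, Claude"}],
})

response = client.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example model ID
    body=body,
)

result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```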
It'd help if you showed the logs rather than the API keys. I'd also suggest removing the API keys, as you've shown the OpenAI API key and this...
Why don't I see your message in the logs? The "Hi..." one.
What I feel is it's Llama 2's inability to follow the prompt and return a correctly formatted response. Try another LLM, like Claude 3 (for which you get $5 of free credit initially)...
Does this happen even after re-running devika.py?
Umm, what I feel is that it didn't work at first because of wrong responses from the LLM to the prompts that are there, since it shows "Invalid response from...
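To illustrate what I mean by a wrong-format response (a hypothetical sketch, not devika's actual parsing code; the function names and the expected JSON shape are made up for the example): the agent expects the model's reply in a specific structure, and when a smaller model like Llama 2 drifts from that format, parsing fails and you end up with the "Invalid response from the LLM" error.

```python
# Hypothetical sketch of why an "Invalid response from the LLM" error can appear.
# The expected format and helper names are illustrative, not devika's real code.
import json
from typing import Callable, Optional

def parse_plan(llm_reply: str) -> Optional[dict]:
    """Expect the model to answer with a JSON object containing a 'plan' key."""
    try:
        data = json.loads(llm_reply)
    except json.JSONDecodeError:
        return None  # model ignored the requested format -> invalid response
    if "plan" not in data:
        return None
    return data

def ask_with_retries(ask_llm: Callable[[str], str], prompt: str, retries: int = 3) -> dict:
    """Re-ask the model a few times before giving up, since weaker models
    often fail to follow the prompt's formatting instructions."""
    for _ in range(retries):
        parsed = parse_plan(ask_llm(prompt))
        if parsed is not None:
            return parsed
    raise ValueError("Invalid response from the LLM after retries")
```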
Fix white UI issues: https://youtu.be/3ULryo20mdc?si=5UoES_wVMxQTdrSq
Here is a video of the same (although the error and the browser opening were hardcoded here just for demonstration): [demo.webm](https://github.com/stitionai/devika/assets/62602902/3617cfb6-0991-4dc9-91b9-9069c10b3810)