Ollama with Devika: no code created, as knowledge_base_context is null
Description of the bug When using Ollama with Devika, the knowledge_base_context dict is null. This causes the prompt to pass this section as null, so the example template is rendered as the final code output.
To Reproduce Steps to reproduce the behavior:
- Go to the user interface, create a project, and choose Llama as the LLM from the top right.
- Enter a prompt to generate a calculator or a user registration screen.
- Click the send button, wait for Devika to browse and ask a question, then answer the question and continue.
- Put a debug breakpoint in coder.py, line 105, and inspect the values.
Expected behavior The knowledge base input is added to the prompt and sent to the LLM.
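To make the failure mode concrete, here is a minimal Python sketch of how a Jinja-style truthiness guard silently drops the knowledge-base section when the value is null, leaving only the hard-coded example in the rendered prompt. This is an illustration, not devika's actual code; the function and section layout are assumptions:

```python
# Hypothetical sketch (not the real coder.py): simulate how a template
# guarded by `{% if knowledge_base_context %}` behaves when the value
# is None or an empty dict.

def render_prompt(user_prompt, knowledge_base_context):
    """Simulate a Jinja template with a truthiness guard on the context."""
    sections = [user_prompt]
    # Jinja's `{% if %}` evaluates False for None, an empty dict, or an
    # empty string alike, so the whole section vanishes from the prompt.
    if knowledge_base_context:
        sections.append(f"Context:\n{knowledge_base_context}")
    return "\n\n".join(sections)

# With a null context the LLM only sees the bare prompt plus whatever
# example scaffold is hard-coded in the template.
print(render_prompt("build a calculator", None))
print(render_prompt("build a calculator", {"search": "results"}))
```

With a populated dict the "Context:" section appears; with None it is omitted entirely, which matches the behavior described above where only the template's example code survives.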
Desktop (please complete the following information):
- OS: Linux
- Browser: Firefox
Additional context If you replace knowledge_base_context with search_results in coder/prompt.jinja, the value is returned in the variable, but the generated code is still the default code.
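The variable-renaming workaround described above can be sketched as follows. This uses Python's string.Template as a lightweight stand-in for Jinja; build_prompt, the template text, and the placeholder string are hypothetical illustrations, not devika's real API:

```python
# Hypothetical illustration of rendering the coder prompt with the
# variable the code actually populates (`search_results`) rather than
# the unused `knowledge_base_context`. Names are assumptions drawn from
# the discussion, not devika internals.
from string import Template

PROMPT = Template("Task: $task\n\nSearch results:\n$search_results")

def build_prompt(task, search_results):
    # A None or empty value falls back to a visible placeholder, so a
    # missing context shows up in the logs instead of rendering as an
    # empty section that silently triggers the example output.
    return PROMPT.substitute(
        task=task,
        search_results=search_results or "(no search results gathered)",
    )
```

In the actual Jinja template the equivalent would be an explicit `{% if %}`/`{% else %}` or the `default` filter, so the prompt makes it obvious when the search results never arrived.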
I don't understand. Are you asking to add a knowledge base? I don't think knowledge_base is currently used, and we don't send it to the template prompt.
And why is it specific to Ollama only?
I am new to open source, so please excuse any silly questions, but I am really interested in this project and want to work on it. Thanks!
Devika stops after the web search? It just searches and then suddenly stops "thinking".
I have tried running Devika completely locally, without calling any paid APIs, using local LLMs through Ollama. The prompts had some syntax and JSON formatting errors which, when corrected, still produce the example code stored in the Jinja template. It doesn't generate any relevant code, and when you look at the inference it sends to the LLM, there is no data inferred from the web searches, just the initial user prompt and a few queries generated by Devika.
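The JSON formatting errors mentioned above are a common failure mode with local models, which often wrap JSON in Markdown fences or surround it with prose. A tolerant parsing step along these lines can absorb most of them; this is a generic sketch, not devika's actual parser, and the function name is hypothetical:

```python
import json
import re

def parse_llm_json(raw: str):
    """Hypothetical sketch of tolerant JSON parsing for LLM replies.

    Strips Markdown code fences and leading chat text before calling
    json.loads, which avoids the formatting errors local models tend
    to produce.
    """
    # Prefer the content of a fenced ```json block if one exists.
    match = re.search(r"```(?:json)?\s*(.*?)```", raw, re.DOTALL)
    candidate = match.group(1) if match else raw
    # Otherwise fall back to the first {...} span so surrounding prose
    # is ignored.
    if not candidate.lstrip().startswith("{"):
        brace = re.search(r"\{.*\}", candidate, re.DOTALL)
        candidate = brace.group(0) if brace else candidate
    try:
        return json.loads(candidate)
    except json.JSONDecodeError:
        return None  # caller can re-prompt the model instead of crashing

reply = 'Sure! Here is the plan:\n```json\n{"step": "search"}\n```'
print(parse_llm_json(reply))
```

Returning None on failure lets the agent re-prompt the model rather than falling through to the template's default example.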
I am currently trying to enable the knowledge base, to make Devika generate separate sets of queries for the frontend, middleware, backend, and database, and to use multiple LLMs based on how each was trained for those purposes.
There are no silly questions; we are all amazed by the great work done by the creators of this project. I am also trying to learn and understand its implementation, and to build on top of it with my limited knowledge of the subject.
Regards, Viv
Yes,
It goes into a loop after asking the user a question, and I am unable to get any output based on the answer. It runs a few searches using DuckDuckGo and shows the web pages, and it throws some exceptions as well. But in the end, the code that is written is the sample code from the Jinja template.
As of this moment, I have gotten knowledge_base working and can see code being generated in the log file. I am still working on getting Devika to generate and run the code.
Try using this coder.py: https://github.com/hqnicolas/devika/blob/main/src/agents/coder/coder.py
I think the problem was with the new coder.py.
Is it solved or not? If it is solved, please close the issue.
The latest Devika version was working with Ollama; I have frozen it at https://github.com/hqnicolas/devika/
Thanks, guys. I think https://github.com/hqnicolas/devika/ is able to generate code. I will update once I complete my tests with an MVC approach in planning and code generation.