Rajakumar05032000

Results: 21 comments by Rajakumar05032000

For people using Groq models: please make sure to apply the fix from #193 (which resolves the Groq client issues; it will only work after that), and then try again.

> Hello @rajakumar05032000, thank you for your suggestion. Even when I tried creating a new project in the Devika UI, I am still getting no reply; kindly note the response below,...

> > create a flappy bird game
>
> Thank you @rajakumar05032000, after creating a Groq key and using it, I am getting a reply from Devika for the Groq llama2 LLM...
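For anyone following the exchange above: the Groq key created on the Groq console has to be placed in Devika's configuration before the model will respond. A minimal sketch of the relevant entry; the section and key names here are assumptions, so check `sample.config.toml` in the repository for the actual layout:

```toml
# Hypothetical excerpt of Devika's config.toml.
# Section and key names are assumptions -- verify against sample.config.toml.
[API_KEYS]
GROQ = "gsk_..."  # API key created in the Groq console; replace with your own
```

After adding the key, restart the Devika backend so the updated config is picked up.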

I have re-tested with the latest code changes and it works smoothly as expected. Could you review and merge it? Thanks!

I have already raised a PR that solves this issue: #193.

I have already created a PR that solves this issue: #190.

> @rajakumar05032000 Would it be possible to list and specify which LLM model is being served from `llama.cpp`? It should be listed and choosable from the drop-down. Currently it just shows LlamaCpp...