
Please add LiteLLM to this project

Open Greatz08 opened this issue 1 year ago • 7 comments

This project is pretty great, BUT we need more options to use different LLMs. You don't have to worry about building a solution that supports 100+ LLMs yourself, as LiteLLM is another FOSS project that can handle this task for you.

Project LiteLLM link - https://github.com/BerriAI/litellm

Adding LiteLLM would be a big win for the project, since many people would easily be able to use many more LLMs, which is what everyone wants. The project would only need three parameters from the user (base URL, model name, and API key), and with the general OpenAI API structure it can send queries and return results. Many big projects have started adding support for LiteLLM to make things more capable in an easier way, so study it, and if you have any questions you can ask the maintainers; they are pretty responsive. If you want to know more about my personal experience using it with other great projects like Flowise, I can tell you about that in detail too.
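To make the "three parameters" point concrete, here is a minimal sketch (my own illustration, not Devika's actual code) of how a base URL, model name, and API key are enough to assemble a request against any OpenAI-compatible `/chat/completions` endpoint, whether that endpoint is LiteLLM's proxy, Ollama, or OpenAI itself:

```python
import json

def build_chat_request(base_url: str, model: str, api_key: str, prompt: str):
    """Assemble the URL, headers, and JSON body for an OpenAI-style
    chat completion call from the three user-supplied parameters."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body
```

With LiteLLM's Python SDK the same three values map onto keyword arguments of `litellm.completion(model=..., api_base=..., api_key=..., messages=...)`, so the caller never has to care which provider is actually behind the URL.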

Greatz08 avatar Apr 04 '24 03:04 Greatz08

Sounds great

RohitX0X avatar Apr 04 '24 06:04 RohitX0X

Ollama already provides an OpenAI compatible API. Why bother with litellm?

phalexo avatar Apr 04 '24 19:04 phalexo

@phalexo Remember, LiteLLM can use Ollama as one of its backends, not the other way around. The problem with Ollama alone is that you have to download and run a heavy model on your own system, and only then can you point another project like this one at its base URL as the LLM source. With LiteLLM we can use any kind of model: closed source like OpenAI, Claude, or Gemini; open models behind a public API like Groq, which serves Mixtral, Gemma, and Llama; or models running locally through Ollama. It solves the problem of driving multiple types of LLMs through a single API structure, which is very convenient and easy to use, and that's why this project needs it.
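The "single API structure over many backends" idea can be sketched like this (my illustration of LiteLLM-style routing, not LiteLLM's internals; the endpoint URLs are assumptions for the example). A `provider/model` prefix picks the backend, while the request shape stays the same OpenAI-compatible structure for all of them:

```python
# Example base URLs per provider; assumed here for illustration only.
PROVIDER_BASE_URLS = {
    "ollama": "http://localhost:11434/v1",
    "groq": "https://api.groq.com/openai/v1",
    "openai": "https://api.openai.com/v1",
}

def resolve_backend(model: str):
    """Split a 'provider/model' string and return (base_url, bare_model_name).
    A bare model name with no prefix falls back to OpenAI, mirroring how
    LiteLLM treats unprefixed model names."""
    provider, _, name = model.partition("/")
    if not name:  # no '/' in the string: default provider
        provider, name = "openai", model
    base = PROVIDER_BASE_URLS.get(provider)
    if base is None:
        raise ValueError(f"unknown provider: {provider}")
    return base, name
```

So `resolve_backend("ollama/llama3")` and `resolve_backend("groq/mixtral-8x7b-32768")` hit completely different servers, yet the calling code builds the same chat-completions payload either way.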

Greatz08 avatar Apr 05 '24 04:04 Greatz08

We need this. LiteLLM, please.

yf007 avatar Apr 08 '24 06:04 yf007

This would really take Devika to the next level. Unlocking so many available models would be a huge gain in capability.

cwallace avatar Apr 20 '24 05:04 cwallace

I support this. The lack of LiteLLM support is a big minus.

lehcode avatar Apr 30 '24 00:04 lehcode