
[FEATURE] Introduce LiteLLM to fabric

chrissonntag opened this issue 1 year ago

This PR introduces LiteLLM to fabric. Instead of being limited to the OpenAI API, users can now choose from a wide variety of LLM providers and models. The most significant change in this PR is that LiteLLM takes over inference from the openai library used previously.

What could be next:

  • Improve exception handling to give users clearer error messages
  • Extend the setup flow to allow defining several environment variables, not just OPENAI_API_KEY
  • Integrate LiteLLM into the server component
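To illustrate the kind of provider-agnostic call this switch enables, here is a minimal sketch using LiteLLM's `completion` API. The helper functions and model identifiers are illustrative, not code from the actual PR:

```python
# Sketch of provider-agnostic inference via LiteLLM (illustrative, not fabric's code).


def build_messages(system: str, user: str) -> list[dict]:
    """Assemble an OpenAI-style message list, the format LiteLLM accepts."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]


def ask(model: str, system: str, user: str) -> str:
    """Send a prompt through LiteLLM; the same call works for any supported provider."""
    from litellm import completion  # requires `pip install litellm` and a provider API key

    response = completion(model=model, messages=build_messages(system, user))
    # LiteLLM normalizes every provider's response to the OpenAI schema:
    return response.choices[0].message.content


# The one function then serves hosted and local backends alike, e.g.:
#   ask("gpt-4", system_prompt, user_prompt)          # OpenAI
#   ask("ollama/llama2", system_prompt, user_prompt)  # local model via Ollama
```

Because only the `model` string changes between providers, the rest of fabric's pipeline (patterns, prompts, output handling) stays untouched.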

chrissonntag avatar Feb 11 '24 10:02 chrissonntag

Oh yes, we definitely want this. Let's get the clients working super smoothly this week and then look into this. Thank you!

danielmiessler avatar Feb 11 '24 18:02 danielmiessler

Ok, will do. I'll keep you posted 👍🏻

chrissonntag avatar Feb 11 '24 19:02 chrissonntag

Sweet!

lmccay avatar Feb 13 '24 19:02 lmccay

@danielmiessler Did you have time to test the LiteLLM integration, and would you be willing to merge the PR? There's another feature request, #101, asking for local LLM support.

I'll work on the proposed tasks in the next few days and will send separate PRs for each task.

What would also help is establishing some kind of coding guideline and adding GitHub Actions to ensure those guidelines are followed.

chrissonntag avatar Feb 14 '24 16:02 chrissonntag

Can you refresh main and resubmit, and reach out to @agu3rra and @xssdoctor to discuss?

danielmiessler avatar Feb 16 '24 23:02 danielmiessler