Fabric
[FEATURE] Introduce LiteLLM to fabric
This PR introduces LiteLLM to fabric. Instead of being limited to the OpenAI API, users can now choose from a wide variety of LLM providers and models. The most significant change in this PR is that LiteLLM replaces the former openai library for inference.
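For context, here is a minimal sketch of what a LiteLLM-based inference call looks like compared to a provider-specific client. The function name, prompt handling, and default model are illustrative, not the exact code in this PR; only the `litellm.completion` call and the OpenAI-style response shape are LiteLLM's actual API.

```python
# Illustrative sketch (not the PR's exact code): LiteLLM exposes an
# OpenAI-compatible completion() interface and routes the request to the
# right provider based on the model name.
import litellm

def run_inference(system_prompt: str, user_input: str, model: str = "gpt-4") -> str:
    response = litellm.completion(
        model=model,  # e.g. "gpt-4", "claude-3-opus-20240229", "ollama/llama2"
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_input},
        ],
    )
    # Responses follow the OpenAI schema regardless of the underlying provider.
    return response.choices[0].message.content
```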
What could be next:
- Better exception handling, to improve the user experience
- Extend setup to allow defining several environment variables, not just OPENAI_API_KEY (see the sketch after this list)
- Integrate LiteLLM into the server component
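A rough sketch of the "extend setup" idea, under the assumption that provider keys are read from the environment: the key names below follow LiteLLM's provider conventions, but the exact list and helper function are hypothetical.

```python
# Hypothetical sketch: let setup collect keys for several providers, not just
# OPENAI_API_KEY. LiteLLM picks up provider keys from the environment, so the
# setup step mainly needs to collect and persist them.
import os

# Illustrative subset of variables; names follow LiteLLM's provider conventions.
SUPPORTED_KEYS = ["OPENAI_API_KEY", "ANTHROPIC_API_KEY", "MISTRAL_API_KEY"]

def configured_providers() -> list[str]:
    """Return the provider key names currently set in the environment."""
    return [key for key in SUPPORTED_KEYS if os.environ.get(key)]
```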
Oh yes, we definitely want this. Let's get the clients working super smoothly this week, and then let's look into this. Thank you!
Ok, will do. I'll keep you posted 👍🏻
Sweet!
@danielmiessler Did you have time to test the LiteLLM integration, and would you be willing to merge the PR? There's also feature request #101 asking for local LLM support.
I'll work on the proposed tasks in the next few days and will send separate PRs for each task.
What would also help is establishing some kind of coding guidelines and adding GitHub Actions to ensure those guidelines are followed.
Can you refresh main and resubmit, and reach out to @agu3rra and @xssdoctor to discuss?