litellm
Title
Adding native support for Snowflake's Cortex LLM endpoint (https://docs.snowflake.com/en/user-guide/snowflake-cortex/llm-functions)
Type
🆕 New Feature 📖 Documentation
Ack
Hi @sfc-gh-alherrera, can you add tests for sync + async completion + streaming here? (Start the function names with `test_`; see https://github.com/BerriAI/litellm/blob/fac3b2ee4238e614dc1f077475d9943dbafbc3a4/tests/local_testing/test_completion.py#L4)
It's okay if the backend is mocked
e.g. here's how we do it for huggingface - https://github.com/BerriAI/litellm/blob/fac3b2ee4238e614dc1f077475d9943dbafbc3a4/tests/local_testing/test_completion.py#L1806
Once you're done, just share a screenshot of it passing your tests and it should be good to merge!
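The requested test shape (sync completion, streaming, and async, all with the backend mocked) could look roughly like the sketch below. This is a hedged illustration, not litellm's actual test code: `fake_cortex_complete` and the response dictionaries are hypothetical stand-ins for the Cortex endpoint, and in a real litellm test the HTTP layer would be patched instead.

```python
import asyncio

# Hypothetical stand-in for a Snowflake Cortex completion call;
# a real test would patch litellm's transport layer instead.
def fake_cortex_complete(prompt, stream=False):
    if stream:
        # Streaming returns an iterator of text chunks.
        return iter(["Hello", ", ", "world"])
    # Non-streaming returns an OpenAI-style response dict.
    return {"choices": [{"message": {"content": "Hello, world"}}]}

def test_snowflake_completion_sync():
    resp = fake_cortex_complete("hi")
    assert resp["choices"][0]["message"]["content"] == "Hello, world"

def test_snowflake_completion_streaming():
    chunks = list(fake_cortex_complete("hi", stream=True))
    assert "".join(chunks) == "Hello, world"

def test_snowflake_completion_async():
    async def run():
        # Async path exercises the same mocked backend off-thread.
        return await asyncio.to_thread(fake_cortex_complete, "hi")
    resp = asyncio.run(run())
    assert resp["choices"][0]["message"]["content"] == "Hello, world"
```

With `pytest` these are picked up automatically because of the `test_` prefix; the assertions verify the mocked response shape rather than a live Snowflake call.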
@krrishdholakia this is ready for review. Here are the test screenshots. Having issues with async support, but completion and streaming are good to go.
@krrishdholakia do you have any comments on the latest changes? Thanks
Hi @krrishdholakia, wondering if this PR looks good? It would be really helpful if we could get this feature in.
hey @sumitdas66 thanks for the bump - i'll try and review it this week.
We do want to add snowflake support, so aligned on ensuring a solution is out.
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.
This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.