
Make LiteLLM the default LLM API

zsimjee opened this issue 9 months ago • 1 comment

Hi @krrishdholakia, my comment on parity is incorrect. I'm not sure what I meant, but I might have meant parity within our own codebase: we didn't support streaming for litellm.

I've also changed the next steps here.

For 0.5.0, I want to make litellm the DEFAULT llm handler we provide first-class support for. We will change all runbooks, etc. to use litellm. We will make the llm_api parameter in guard calls optional and pass all args provided in that call directly through to an internal litellm chat client.

Originally posted by @zsimjee in https://github.com/guardrails-ai/guardrails/discussions/680#discussioncomment-9348176
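
A minimal sketch of the call shape this describes, assuming the planned 0.5.0 pass-through behavior lands as written; the model string and message are purely illustrative:

```python
from guardrails import Guard

guard = Guard()

# With llm_api omitted, the `model` kwarg (and any other args) is passed
# straight through to an internal LiteLLM client, which resolves the
# provider from the model string.
result = guard(
    model="gpt-4o",  # any LiteLLM-routable model string
    messages=[{"role": "user", "content": "Summarize LiteLLM in one line."}],
)
```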

  • [x] Use the OpenAI interface for our main callable, Guard.call. We do not need to do this explicitly; instead, we can take all args and kwargs and pass them through to the litellm SDK.
  • [ ] Use the same interface within validators that use LLMs.
  • [ ] Support batch and async litellm workflows.
  • [x] Make the llm_callable param in Guard.call optional. When it is not provided but an arg that litellm uses to determine the model (the model arg) is passed, automatically create and use a LiteLLM client (see the sketch after this list). For async, use acreate and attach to the event loop if it exists by then.
  • [ ] Make changes in the Guardrails API that let users pass the same params over the wire and automatically use a generated LiteLLM client to make LLM requests on the server.
  • [x] Make sure custom callables still work.
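
A rough sketch of the dispatch logic the checklist implies. `resolve_llm_callable` is a hypothetical name, not actual guardrails internals; note that litellm's own async entry point is `acompletion` (the checklist's `acreate` is the older OpenAI-SDK-era name):

```python
import litellm


def resolve_llm_callable(llm_api=None, is_async=False, **kwargs):
    """Hypothetical helper: choose the LLM callable for a guard call."""
    if llm_api is not None:
        # Custom callables keep working: use exactly what the caller gave us.
        return llm_api
    if "model" in kwargs:
        # No callable provided, but litellm can infer the provider from the
        # model string, so fall back to the LiteLLM SDK.
        return litellm.acompletion if is_async else litellm.completion
    raise ValueError("Pass either llm_api or a `model` argument.")


# Example: resolves to litellm.completion, which routes to OpenAI here.
fn = resolve_llm_callable(model="gpt-4o")
response = fn(model="gpt-4o", messages=[{"role": "user", "content": "Hi"}])
```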

zsimjee • May 08 '24 00:05

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 14 days.

github-actions[bot] • Aug 10 '24 01:08

This issue was closed because it has been stalled for 14 days with no activity.

github-actions[bot] • Aug 24 '24 03:08