
[Feature]: Allow listing underlying models when using a wildcard

Open merlijn opened this issue 1 year ago • 9 comments

The Feature

Hi there. I am using this config:

model_list:
  - model_name: "*"
    litellm_params:
      model: "openai/*"
      api_key: os.environ/OPENAI_API_KEY

It works in that I can call /chat/completions, for example. However, if I query /models or /v1/models, I just get back a single model:

{"data":[{"id":"*","object":"model","created":1677610602,"owned_by":"openai"}],"object":"list"}

It would be great if it listed the models available from OpenAI so that I don't have to maintain that list in the UI app.

Motivation, pitch

It would be very convenient to auto-populate available models in a UI. You could proxy the call to /models on the OpenAI (or other vendor) API and return the response.
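In other words, something like this naive passthrough (a sketch only, not how LiteLLM is implemented; it assumes the OpenAI key is already in the environment):

import os
import requests

def list_upstream_models():
    # Forward the listing to the provider's own /models endpoint.
    resp = requests.get(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    )
    resp.raise_for_status()
    return [m["id"] for m in resp.json()["data"]]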

Twitter / LinkedIn details

No response

merlijn avatar Jul 26 '24 15:07 merlijn

Hey @merlijn, is this inferring the models based on credentials in the environment?

Can we do a 10min call to understand your use-case better? https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

krrishdholakia avatar Jul 26 '24 17:07 krrishdholakia

I would like to suggest a similar feature that allows users to set keys in a configuration file and load these keys using the function litellm.load_keys(fpath). This feature offers several advantages:

  1. Enhanced Security: By storing keys in a configuration file that is kept out of version control, users can minimize the risk of key leakage.
  2. Reduced Redundancy: This feature would eliminate the need to copy and paste keys across multiple files, which can be particularly cumbersome when using multiple providers.
  3. Simplified Key Management: Centralizing key management in a single configuration file makes it easier for users to maintain and update their keys.

Configuration File Example:

- provider: "openai"
  models: [] # List of model names; if specified, only these models will be accessible
  api_keys: [] # List of API keys for redundancy and different budgets
  api_version: ... # Specify API version, if applicable
  api_base: ... # Specify API base URL, if applicable
  location: [] # List of locations, if applicable
  project: ... # Project identifier, if applicable

Justification:

  1. models: This field allows users to specify a list of models. If models are specified, the API will be restricted to calling only these models. If not specified, the API can call any model available from the provider. This provides greater control over the resources being used.

  2. api_keys: Users may have multiple API keys, either due to different budgets or to ensure availability in case one service is down. By allowing a list of keys, users can switch between them automatically as needed.

  3. Additional Fields: The api_version, api_base, location, and project fields provide additional configuration options to tailor the setup to specific requirements, ensuring flexibility and comprehensive control over the API usage.
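For illustration, load_keys could be as simple as the sketch below (load_keys is the function proposed above, not an existing LiteLLM API; the environment variable naming is an assumption based on the os.environ/OPENAI_API_KEY convention used earlier in this issue):

import os
import yaml

def load_keys(fpath):
    # Hypothetical helper: read provider entries from the YAML file described
    # above and export their credentials as environment variables.
    with open(fpath) as f:
        entries = yaml.safe_load(f)
    for entry in entries:
        provider = entry["provider"].upper()
        api_keys = entry.get("api_keys") or []
        if api_keys:
            # Use the first key by default; rotation across keys is left out here.
            os.environ[f"{provider}_API_KEY"] = api_keys[0]
        if entry.get("api_base"):
            os.environ[f"{provider}_API_BASE"] = entry["api_base"]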

lowspace avatar Jul 26 '24 18:07 lowspace

hey @lowspace how would you handle something like Azure - where the api_base + api_key + model combination is unique (not all api bases have access to all models, and not all keys work for all api bases)
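To make the coupling concrete: in the proposed format, an Azure setup might need one entry per deployment, since each model is tied to a specific endpoint and key (a hypothetical example; the endpoints and keys are made up):

- provider: "azure"
  api_base: "https://eu-endpoint.openai.azure.com" # only gpt-4 is deployed here
  api_keys: ["<key-for-eu-endpoint>"]
  models: ["gpt-4"]
- provider: "azure"
  api_base: "https://us-endpoint.openai.azure.com" # only gpt-35-turbo is deployed here
  api_keys: ["<key-for-us-endpoint>"]
  models: ["gpt-35-turbo"]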

krrishdholakia avatar Jul 26 '24 20:07 krrishdholakia

if so, use a unique id generated by the user as the primary key instead of the provider

lowspace avatar Jul 26 '24 20:07 lowspace

can you give me an example of what that might look like? @lowspace

krrishdholakia avatar Jul 26 '24 20:07 krrishdholakia

also can we move to a separate issue - will be easier to track

krrishdholakia avatar Jul 26 '24 20:07 krrishdholakia

This is just a suggestion. I didn't think it through because I don't know the ins and outs of LiteLLM that well. I am playing around with a simple frontend app that allows users to switch between different vendors and models to gain some experience in coding with LLMs.

LiteLLM is ideal for this because I can just write code against a single API instead of all the different flavours of the different vendors. At least that is the idea.

In any case, it is just a convenience, quality-of-life feature. If need be, I can just maintain a list of models in the app. No problem.

merlijn avatar Jul 26 '24 20:07 merlijn

hey @merlijn got it - then I think what we can do is return the models available based on keys in the environment
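A rough sketch of that idea (illustrative only; the provider-to-model mapping here is a stand-in for whatever source of truth the proxy actually uses):

import os

# Which environment variable signals that a provider is configured.
PROVIDER_KEY_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
}

def models_from_env(known_models):
    # known_models: dict mapping provider name -> list of model ids.
    available = []
    for provider, env_var in PROVIDER_KEY_VARS.items():
        if os.environ.get(env_var):
            available.extend(known_models.get(provider, []))
    return available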

what frontend app is this? Will help for e2e testing

krrishdholakia avatar Jul 26 '24 20:07 krrishdholakia

It is a Telegram bot, actually. Like I said, this is a hobby project for me to learn coding with LLMs, not some major project that is actually in production, so don't prioritise this just for me. But if there is more demand for it, then great. Thanks for making this project, FYI :)

merlijn avatar Jul 26 '24 20:07 merlijn

I can add a use case to this issue. I want to use OpenWebUI and have it talk to LiteLLM instead of OpenAI directly; it's annoying to have to add every model for every provider to the config file manually. OpenWebUI queries the available models to show to users, but LiteLLM does not return any models when I use a wildcard. What is worse, "anthropic/*" shows up as a selectable model in the UI.

nkeilar avatar Aug 29 '24 02:08 nkeilar