google-cloud-litellm-proxy
Feature request: Integration with other apps
Thank you so much for your effort and hard work. I followed your guide to the letter and got it up and running; the curl commands work, and I get responses from the models.
Would you know how to set this up with gptme, for example, which requires OPENAI_API_KEY and OPENAI_API_BASE? I'm guessing the base is the same URL as the LiteLLM server, and that the key is the one we use for the Authorization header, but when I try running any model, gptme can't find it and defaults to the original GPT-4.
Some apps expect a different OPENAI_API_BASE than you might assume. Have you tried both of these variants?
- https://my_litellm_base.run.app
- https://my_litellm_base.run.app/chat/completions
I've had no problem using the proxy through my chat apps that require an OpenAI base and key to be set.
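For what it's worth, here's a minimal sketch of the setup, assuming an OpenAI-compatible LiteLLM deployment (the base URL, key, and model name below are placeholders — substitute your own Cloud Run URL and the key from your Authorization header):

```shell
# Placeholders: replace with your actual Cloud Run URL and LiteLLM key.
export OPENAI_API_BASE="https://my_litellm_base.run.app"
export OPENAI_API_KEY="sk-your-litellm-key"

# Sanity-check the base the same way an OpenAI-style client would use it.
# If this 404s, retry with /v1 or /chat/completions appended to the base.
curl -s "$OPENAI_API_BASE/chat/completions" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "ping"}]}'
```

If the curl succeeds but gptme still falls back to GPT-4, the issue is likely that gptme isn't picking up the environment variables, or it expects the base with or without a trailing path segment.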