
Support for setting a custom base URL for each provider alongside the key

Ran-Mewo opened this issue 1 year ago • 8 comments

It would be useful to add an optional base URL parameter alongside the key for each provider. Some organizations and companies only allow access to their language models through a reverse proxy, so the ability to set a custom base URL becomes mandatory for any communication with those models through a library like this.

I'd be happy with just the ability to set base URLs alongside the keys. This would also let people set up fallback mechanisms when they have multiple proxies, the same way the gateway currently allows fallback across multiple keys.

For the future roadmap on advanced configuration, it might also be useful to have an option to set a custom URL for each endpoint a provider requires. Though looking at how the API makes requests, I don't think this is strictly necessary, since you can already specify a custom path by changing /v1/chat/completions to something else when you make the request.

Ran-Mewo avatar Jan 14 '24 12:01 Ran-Mewo

Hi @Ran-Mewo,

This makes sense.

Curious to know, why would organisations have another proxy after the gateway? Trying to understand the use cases.

ayush-portkey avatar Jan 15 '24 23:01 ayush-portkey

Oh, that's not quite what I meant. True, it doesn't really make sense for organizations to run another proxy if they're already going to use the gateway (unless they have some reason for it).

Some organizations provide AI models for personal usage too, so users can use the gateway to link up all the different proxies for themselves. For example, if your organization only provides Anthropic and you have your own access to OpenAI, you can use the gateway to point Anthropic at your organization's proxy while pointing OpenAI at your own key.

It's more for people who have access to reverse proxies where the proxy operator doesn't want to run the gateway themselves. Or the reverse proxy only exposes a few models and you want to use the gateway to tie everything together: you link the reverse proxies to their respective providers and link up the rest with your own keys.

And reverse proxies aren't just an organization thing, so it's better to offer that option to anyone who wants to use it.

Ran-Mewo avatar Jan 16 '24 10:01 Ran-Mewo

I sort of want something like this, if it's doable:

// Load balancing between 2 OpenAI proxies
{
  "strategy": {
      "mode": "loadbalance"
    },
  "targets": [
    {
      "provider": "openai",
      "base_url": "<reverse proxy url>", // by default it's https://api.openai.com/v1 
      "api_key": "<the authorization key to the reverse proxy>"
    },
    {
      "provider": "openai",
      "base_url": "<another reverse proxy url>", // by default it's https://api.openai.com/v1 
      "api_key": "<the authorization key to the reverse proxy>"
    }
  ]
}
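
For illustration, a config like the one above would presumably be supplied per request the same way other gateway configs are. A minimal sketch, assuming the gateway's x-portkey-config header and treating the base_url field as the hypothetical addition this issue is requesting:

# Sketch only: base_url per target is the requested (not yet existing) field.
# The config is passed as a single JSON string in the x-portkey-config header (assumed mechanism).
CONFIG='{"strategy":{"mode":"loadbalance"},"targets":[{"provider":"openai","base_url":"<reverse proxy url>","api_key":"<proxy key>"},{"provider":"openai","base_url":"<another reverse proxy url>","api_key":"<proxy key>"}]}'

curl '127.0.0.1:8787/v1/chat/completions' \
  -H "x-portkey-config: $CONFIG" \
  -H 'Content-Type: application/json' \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Say this is a test."}]}'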

Ran-Mewo avatar Jan 16 '24 11:01 Ran-Mewo

+1 Hope to add a custom baseURL option.

Use Case

https://www.ohmygpt.com/

The pricing for this website's GPT-3.5 is more affordable than OpenAI's, so I'll use this website.

curl '127.0.0.1:8787/v1/chat/completions' \
  -H 'x-portkey-baseURL: https://api.ohmygpt.com/v1' \
  -H 'x-portkey-provider: openai' \
  -H "Authorization: Bearer $OPENAI_KEY" \
  -H 'Content-Type: application/json' \
  -d '{"messages": [{"role": "user","content": "Say this is test."}], "max_tokens": 20, "model": "gpt-3.5-turbo"}'

// Load balancing between 2 OpenAI proxies
{
  "strategy": {
      "mode": "loadbalance"
    },
  "targets": [
    {
      "provider": "openai",
      "base_url": "https://api.openai.com/v1 ", // by default it's https://api.openai.com/v1 
      "api_key": "sk-xxxxxx"
    },
    {
      "provider": "openai",
      "base_url": "https://api.ohmygpt.com/v1", // by default it's https://api.openai.com/v1 
      "api_key": "key-xxxxxx"
    }
  ]
}

HereOrCode avatar Jan 17 '24 12:01 HereOrCode

Hi, any updates?

HereOrCode avatar Jan 22 '24 07:01 HereOrCode

Hey, sorry for the delay here!

Quick update: we are actively considering this and think it could also help support local models like the ones from Ollama (#14), as well as partially address #130.

vrushankportkey avatar Jan 23 '24 15:01 vrushankportkey

Any more progress on this, or is it still under consideration? The gateway would be massively useful to me, but sadly I need this feature in order to even start using it.

Ran-Mewo avatar Feb 01 '24 09:02 Ran-Mewo

Hey @Ran-Mewo - The PR for ollama is a WIP and should be merged soon. In that PR itself, we will introduce a base_url param that you can use to override the provider's base url. I hope that will solve your problem. I will update here once the PR is ready to review.

VisargD avatar Feb 01 '24 11:02 VisargD

This would be a very useful parameter. Use case: for security reasons, an enterprise defines a customized gateway, and all egress traffic must go through a specified proxy. This would be very useful.

zsinba avatar Feb 14 '24 04:02 zsinba

Wait, is it here yet? If so, could some usage examples be provided? Most specifically, some sort of load balancing between two OpenAI proxies.

Ran-Mewo avatar Feb 22 '24 09:02 Ran-Mewo

@Ran-Mewo

Looking at the recent updates, you can now use custom URLs, but there is no documentation yet on how to use them.

fix: custom host based url creation for proxy routes https://github.com/Portkey-AI/gateway/commit/2a25274c3e740be8141ef18b713a92e22ca3dd65
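
In the absence of docs, here is a guess at per-request usage based on that commit's naming. The x-portkey-custom-host header name is an assumption; confirm against the documentation linked later in this thread:

# Assumption: the custom host is supplied per request via an x-portkey-custom-host header.
# Whether the value should include the /v1 path may depend on the provider; verify in the docs.
curl '127.0.0.1:8787/v1/chat/completions' \
  -H 'x-portkey-provider: openai' \
  -H 'x-portkey-custom-host: https://api.ohmygpt.com/v1' \
  -H "Authorization: Bearer $OPENAI_KEY" \
  -H 'Content-Type: application/json' \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Say this is a test."}], "max_tokens": 20}'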

HereOrCode avatar Feb 22 '24 11:02 HereOrCode

@Ran-Mewo @code4you2021 folks, so sorry for the delay!!

Sharing comprehensive docs here -

  • How to use the custom host property: https://portkey.ai/docs/product/ai-gateway-streamline-llm-integrations/universal-api#integrating-local-or-private-models (see the config sketch after this list)
  • Ollama documentation: https://portkey.ai/docs/welcome/integration-guides/ollama
  • Bring your own LLM: https://portkey.ai/docs/welcome/integration-guides/byollm
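
Tying this back to the original request: combined with the custom host property from the first link, the load-balancing config sketched earlier in this thread would presumably look something like the following. The custom_host field name is inferred from the commit and docs wording, so treat this as illustrative rather than authoritative:

// Load balancing between 2 OpenAI-compatible proxies via the custom host property (sketch)
{
  "strategy": {
    "mode": "loadbalance"
  },
  "targets": [
    {
      "provider": "openai",
      "custom_host": "<reverse proxy url>", // assumed field name; see the universal API docs above
      "api_key": "<authorization key for that proxy>"
    },
    {
      "provider": "openai",
      "custom_host": "<another reverse proxy url>",
      "api_key": "<authorization key for the other proxy>"
    }
  ]
}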

Please let me know if this is helpful. Closing this issue since it's a few months old now. We can connect on email at [email protected], Discord at https://portkey.ai/community, or just feel free to open up a new issue if you'd like to ask/confirm anything!

vrushankportkey avatar Apr 25 '24 07:04 vrushankportkey