
How to override the OpenAI API URL?

Open weekendli opened this issue 10 months ago • 5 comments

It seems potpie can only use third-party models through OpenRouter. It would be great to expose the LLM service URL in .env, so that we can configure OpenAI-compatible services.

thanks

weekendli avatar Feb 28 '25 01:02 weekendli

Hi @weekendli ! Thanks for using potpie!

You can use any provider supported by litellm by setting the following variables in your env file:

LLM_PROVIDER=provider_name
LLM_API_KEY=provider_key
LOW_REASONING_MODEL=provider/model_name
HIGH_REASONING_MODEL=provider/model_name
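For example, a filled-in `.env` using Anthropic as the provider might look like this (the key is a placeholder; model strings follow litellm's `provider/model` naming):

```shell
# Hypothetical .env values -- key and model choices are placeholders
LLM_PROVIDER=anthropic
LLM_API_KEY=sk-ant-your-key-here
LOW_REASONING_MODEL=anthropic/claude-3-haiku-20240307
HIGH_REASONING_MODEL=anthropic/claude-3-5-sonnet-20240620
```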

dhirenmathur avatar Feb 28 '25 06:02 dhirenmathur

@weekendli may I close this if there are no further questions?

dhirenmathur avatar Mar 03 '25 06:03 dhirenmathur

It seems the provider name alone is not enough if the provider is not configured in potpie. We currently have some self-hosted models that are OpenAI compatible. Because the API address is not configurable, we are not able to use those self-hosted models.

If we could configure the model API address, e.g. by changing it from openai.com to something else, that would be great.

FYI, litellm does support OpenAI-compatible APIs: https://docs.litellm.ai/docs/providers/openai_compatible
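Per that litellm doc, a self-hosted OpenAI-compatible server is addressed by prefixing the model name with `openai/` and pointing `api_base` at your server. A minimal sketch (host, model name, and key are placeholders; many self-hosted servers accept any key):

```python
def openai_compatible_params(model: str, api_base: str, api_key: str) -> dict:
    """Build litellm kwargs for an OpenAI-compatible endpoint.

    litellm routes to its OpenAI-compatible provider when the model
    string is prefixed with "openai/" and api_base points at the server.
    """
    return {
        "model": f"openai/{model}",
        "api_base": api_base,
        "api_key": api_key,
    }


params = openai_compatible_params(
    "my-local-model",              # placeholder model name
    "http://localhost:8000/v1",    # placeholder self-hosted endpoint
    "sk-anything",                 # placeholder key
)
# The actual call would then be:
# from litellm import completion
# response = completion(messages=[{"role": "user", "content": "hi"}], **params)
```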

thanks.

weekendli avatar Mar 03 '25 06:03 weekendli

@weekendli we recently added support for the LLM_API_BASE env variable as part of supporting Azure OpenAI, but there is a simple check for azure in provider_service that you would have to override to make it work for non-Azure models: https://github.com/potpie-ai/potpie/blob/03088e21c3a0e9b3c752d22cbda3c9b9cd9af165/app/modules/intelligence/provider/provider_service.py#L230C34-L230C35
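The override amounts to forwarding `LLM_API_BASE` for every provider instead of only when the provider is azure. A sketch of that idea, assuming the linked check gates on the provider name (function and variable names here are illustrative, not potpie's actual code):

```python
import os


def llm_kwargs(model: str) -> dict:
    """Build litellm call kwargs, passing LLM_API_BASE through for any
    provider rather than only azure (illustrative sketch; potpie's real
    check lives in provider_service.py at the line linked above)."""
    kwargs = {"model": model}
    api_base = os.environ.get("LLM_API_BASE")
    if api_base:  # forward the custom endpoint unconditionally
        kwargs["api_base"] = api_base
    return kwargs


os.environ["LLM_API_BASE"] = "http://localhost:8000/v1"  # placeholder URL
print(llm_kwargs("openai/my-local-model"))
```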

dhirenmathur avatar Mar 26 '25 11:03 dhirenmathur

Thanks for the follow-up.


weekendli avatar May 09 '25 03:05 weekendli