How to override the openai API URL?
It seems potpie can only use third-party models through OpenRouter. It would be great to expose the LLM service URL in .env so that we can configure OpenAI-compatible services.
thanks
Hi @weekendli ! Thanks for using potpie!
You can use any provider supported by litellm by setting the following variables in your env file:
LLM_PROVIDER=provider_name
LLM_API_KEY=provider_key
LOW_REASONING_MODEL=provider/model_name
HIGH_REASONING_MODEL=provider/model_name
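For example, a filled-in .env might look like this (the key and model names below are illustrative placeholders, not values from this thread):

```
LLM_PROVIDER=anthropic
LLM_API_KEY=sk-ant-xxxxxxxx
LOW_REASONING_MODEL=anthropic/claude-3-haiku-20240307
HIGH_REASONING_MODEL=anthropic/claude-3-opus-20240229
```

The `provider/model_name` value follows litellm's model naming convention, so any provider litellm supports should work the same way.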
@weekendli may I close this if there are no further questions?
It seems the provider name alone is not enough if the provider is not already configured in potpie. We currently have some self-hosted models that are OpenAI-compatible, but because the API address is not configurable, we are not able to use them.
If we could configure the model API address, e.g. point it at an endpoint other than the default OpenAI one, that would be great.
FYI, litellm does support OpenAI-compatible APIs: https://docs.litellm.ai/docs/providers/openai_compatible
thanks.
@weekendli we recently added support for the LLM_API_BASE env variable as part of Azure OpenAI support, but there is a simple check for azure in provider_service that you would have to override to make it work for non-Azure models: https://github.com/potpie-ai/potpie/blob/03088e21c3a0e9b3c752d22cbda3c9b9cd9af165/app/modules/intelligence/provider/provider_service.py#L230C34-L230C35
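Assuming that azure check in provider_service.py is relaxed to pass the base URL through for any provider, a .env for a self-hosted OpenAI-compatible server might look like the sketch below (the endpoint, key, and model names are hypothetical):

```
LLM_PROVIDER=openai
LLM_API_KEY=anything  # many self-hosted servers accept any key
LLM_API_BASE=http://localhost:8000/v1
LOW_REASONING_MODEL=openai/my-self-hosted-model
HIGH_REASONING_MODEL=openai/my-self-hosted-model
```

The `openai/` prefix tells litellm to use its OpenAI-compatible code path, and the base URL redirects requests from the default OpenAI endpoint to the self-hosted server.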
Thanks for following up.