Support / Document Azure OpenAI via OpenAI-compatible /openai/v1 endpoint
Summary
Strix works correctly with Azure OpenAI when using Azure’s OpenAI-compatible API (/openai/v1), but this is currently undocumented and non-obvious for users.
This issue proposes documenting the correct configuration so users can run Strix against Azure OpenAI without any code changes.
Background
Strix uses LiteLLM and communicates with LLMs using the OpenAI-compatible API.
Azure OpenAI exposes a fully compatible endpoint at: https://<RESOURCE-NAME>.cognitiveservices.azure.com/openai/v1/
When this endpoint is used as LLM_API_BASE, Strix works out of the box.
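A quick way to confirm the endpoint is OpenAI-compatible, independent of Strix, is a direct curl call against the chat completions route. This is a sketch: the resource name, deployment name, and key are placeholders you must substitute before running.

```shell
# Sketch only: replace <RESOURCE-NAME> and <deployment-name>, and export
# AZURE_OPENAI_API_KEY with your real key, before running.
curl "https://<RESOURCE-NAME>.cognitiveservices.azure.com/openai/v1/chat/completions" \
  -H "Authorization: Bearer $AZURE_OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "<deployment-name>",
    "messages": [{"role": "user", "content": "ping"}]
  }'
```

A JSON chat completion response (rather than a 404 or routing error) indicates the base URL is usable as LLM_API_BASE.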
However, many Azure users instead try the legacy deployment-scoped endpoints under:
/openai/deployments/
These legacy endpoints are not OpenAI-compatible and will not work with Strix.
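For clarity, the two URL shapes can be contrasted side by side (resource and deployment names are placeholders; the api-version value is just a typical example of what the legacy form requires):

```shell
# OpenAI-compatible -- works with Strix:
#   https://<RESOURCE-NAME>.cognitiveservices.azure.com/openai/v1/

# Legacy deployment-scoped -- NOT OpenAI-compatible, does not work with Strix:
#   https://<RESOURCE-NAME>.cognitiveservices.azure.com/openai/deployments/...?api-version=...
```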
Verified Working Configuration
No Strix code changes are required.
Environment Variables

```shell
# Model: "openai/" prefix + your Azure deployment name
# (here the deployment is named "gpt-4o")
export STRIX_LLM="openai/gpt-4o"

# Azure OpenAI API key (substitute your actual key value,
# not the literal string "AZURE_OPENAI_API_KEY")
export LLM_API_KEY="<AZURE_OPENAI_API_KEY>"

# Azure OpenAI's OpenAI-compatible base URL
export LLM_API_BASE="https://<RESOURCE-NAME>.cognitiveservices.azure.com/openai/v1/"
```
Important notes:
- The segment of STRIX_LLM after the openai/ prefix must match your Azure deployment name
- LLM_API_BASE must include the /openai/v1/ suffix
- No api-version query parameter is required
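Because the trailing /openai/v1/ segment is the most common mistake, a small pre-flight check can catch a legacy base URL before launching Strix. The helper below is a hypothetical sketch, not part of Strix itself:

```python
import os


def check_llm_api_base(base: str) -> None:
    """Fail fast if LLM_API_BASE points at a legacy Azure endpoint.

    Hypothetical helper: raises ValueError for the deployment-scoped
    legacy form or for a base URL missing the /openai/v1/ suffix.
    """
    if "/openai/deployments/" in base:
        raise ValueError(
            "Legacy Azure endpoint detected; use the OpenAI-compatible "
            "https://<RESOURCE-NAME>.cognitiveservices.azure.com/openai/v1/ form"
        )
    if not base.rstrip("/").endswith("/openai/v1"):
        raise ValueError("LLM_API_BASE must include the /openai/v1/ suffix")


# Validate the configured base URL, falling back to a placeholder example.
check_llm_api_base(
    os.environ.get(
        "LLM_API_BASE",
        "https://example.cognitiveservices.azure.com/openai/v1/",
    )
)
```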