Azure Foundry models don't work (Mistral Codestral 2501 with Azure)
Issue Category
Undocumented feature or missing documentation
Affected Documentation Page URL
https://docs.continue.dev/customize/model-providers/azure
Issue Description
Codestral 2501, available in the Azure Hub through the Azure Foundry or Azure ML products, has a different target URI than Azure OpenAI models.
Azure OpenAI Target URI:
https://just-an-example.openai.azure.com/openai/deployments/gpt-4o-july/chat/completions?api-version=2023-03-15-preview
Foundry Codestral URL example (same process for Azure ML-deployed models):
https://just-an-example.openai.azure.com/chat/completions?api-version=2023-03-15-preview
The only difference is the path segment: openai/deployments/gpt-4o-july
How can such models be configured? Could you please update the Codestral documentation, which seems to be incomplete or wrong for Codestral autocomplete on Azure?
Adding this feature would allow both Azure Foundry and Azure ML models to be used with Continue, because the target URI format is the same for both products (I know Azure is messy).
RESOLUTION: just add a new apiType value, 'foundry', that builds the correct URI.
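For illustration, here is a minimal sketch of what this could look like in config.json. It is only an assumption: it reuses the Azure provider fields documented on the page above (apiBase, apiVersion, apiKey), places the entry under tabAutocompleteModel since this report is about autocomplete, and uses the proposed apiType value 'foundry', which does not exist yet; the title, model name, URL, and key are placeholders.

```json
{
  "tabAutocompleteModel": {
    "title": "Codestral 2501 (Azure Foundry)",
    "provider": "azure",
    "model": "codestral-2501",
    "apiType": "foundry",
    "apiBase": "https://just-an-example.openai.azure.com",
    "apiVersion": "2023-03-15-preview",
    "apiKey": "<YOUR_API_KEY>"
  }
}
```

With apiType set to 'foundry', Continue would build the request URL as apiBase/chat/completions?api-version=apiVersion, i.e. without the openai/deployments/<deployment> segment, which is what the Foundry/Azure ML endpoints expect.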
Version: 0.8.66 IDE: VS Code
Thank you,
Expected Content
No response