[Feature]: Support / documentation for Azure OpenAI configuration
Which desktop app does this feature request relate to?
- [ ] Select a project 👇
- [x] Agent-TARS (cli, server, agent, tool etc.)
- [ ] UI-TARS Desktop
What problem does this feature solve?
I noticed that the CLI workspace wizard (agent-tars workspace --init) lets me pick “azure-openai” as the default model provider, but I couldn’t find any docs or examples that show:
- Whether Azure OpenAI is officially supported right now.
- What a minimal agent-tars.config.ts (or JSON/YAML) should look like for Azure OpenAI, including required fields such as endpoint, apiKey, deployment name, etc. (see the sketch just below for the kind of example I'm hoping for).
- Any CLI-only flags I may need (e.g. --azure.endpoint vs --model.baseURL).
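For illustration, this is the kind of documented example I was hoping to find. The endpoint, apiVersion, and deployment-related fields here are guesses on my part (mirroring what Azure's own SDKs require), not options I know Agent-TARS actually supports:

```ts
// Hypothetical agent-tars.config.ts for native Azure OpenAI.
// Field names below are assumptions, not confirmed Agent-TARS options.
export default defineConfig({
  model: {
    provider: 'azure-openai',
    id: 'gpt-4o',                                     // Azure deployment name? or raw model id?
    apiKey: process.env.AZURE_OPENAI_API_KEY,
    endpoint: 'https://<resource>.openai.azure.com',  // or is baseURL the intended field?
    apiVersion: '2024-06-01',                         // Azure's REST API requires an api-version
  },
});
```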
What does the proposed feature look like?
Steps I tried
agent-tars workspace --init   # chose “azure-openai”
# then edited agent-tars.config.ts roughly like:
export default defineConfig({
  model: {
    provider: 'azure-openai',
    id: 'gpt-4o',
    apiKey: process.env.AZURE_OPENAI_API_KEY,
  },
});
Starting with agent-tars --open results in:
InputError: Azure OpenAI endpoint is required.
If I add baseURL,
agent-tars workspace --init   # chose “azure-openai”
# then edited agent-tars.config.ts roughly like:
export default defineConfig({
  model: {
    provider: 'azure-openai',
    id: 'gpt-4o',
    apiKey: process.env.AZURE_OPENAI_API_KEY,
    baseURL: 'https://<resource>.openai.azure.com',
  },
});
Starting with agent-tars --open results in:
Error: baseURL and endpoint are mutually exclusive
Thank you!
@Wendyfff0616 Thanks for your feedback! I often use the azure-openai provider, but I noticed that our API serving providers are different. Could you post the documentation from the original model provider that you followed when setting up the azure-openai model? I can then follow your steps to reproduce it.
Thanks for the reply!
I followed the standard Microsoft Azure OpenAI setup as shown in their quickstart guide. Here's a simplified reference based on the documentation I used: Get Started.md
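In short, the quickstart has you call your own Azure resource endpoint with a deployment name and an api-version, roughly like this (a sketch based on the openai Node SDK's AzureOpenAI client; the resource name, deployment name, and api-version below are placeholders):

```ts
import { AzureOpenAI } from 'openai';

// Sketch of the standard Microsoft Azure OpenAI setup from the quickstart.
// <resource>, the deployment name, and the apiVersion are placeholders.
const client = new AzureOpenAI({
  endpoint: 'https://<resource>.openai.azure.com',
  apiKey: process.env.AZURE_OPENAI_API_KEY,
  apiVersion: '2024-06-01',
  deployment: 'gpt-4o', // the name given to the deployment in Azure, not necessarily the model id
});

const completion = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello from Agent-TARS!' }],
});
console.log(completion.choices[0].message.content);
```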
By the way, I just realized how you're currently using azure-openai in the codebase: you're using a third-party API proxy service that exposes an Azure OpenAI-compatible API (with AWS_CLAUDE_API_BASE_URL and the model aws_sdk_claude37_sonnet), rather than Microsoft's native Azure OpenAI service, is that right?
model: {
  provider: 'azure-openai',
  baseURL: process.env.AWS_CLAUDE_API_BASE_URL,
  id: 'aws_sdk_claude37_sonnet',
},
[Update]
I previously encountered the following error when running Agent TARS with my Azure OpenAI configuration:
Error: baseURL and endpoint are mutually exclusive
After investigation, I found this was caused by having the AZURE_OPENAI_ENDPOINT environment variable set in my terminal session. Once I unset this environment variable, the error disappeared.
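For context, my mental model of what was happening is roughly this (a guess at the shape of the logic, not the actual Agent TARS code):

```ts
// Hypothetical sketch: an env var picked up implicitly colliding with a
// baseURL set explicitly in agent-tars.config.ts.
const config = { model: { baseURL: 'https://<resource>.openai.azure.com' } };

const endpoint = process.env.AZURE_OPENAI_ENDPOINT; // was still set in my shell session
const baseURL = config.model.baseURL;

if (endpoint && baseURL) {
  throw new Error('baseURL and endpoint are mutually exclusive');
}
```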
New Problem
However, I now encounter a new error:
[Stream] Error in agent loop execution: Error: 400 Unrecognized request argument supplied: thinking
Analysis
After reviewing the Agent TARS code, I suspect the root cause is as follows:
In UI-TARS-desktop/multimodal/agent/src/agent/llm-client.ts, the following logic is used when creating the LLM client:
// UI-TARS-desktop/multimodal/agent/src/agent/llm-client.ts
return createLLMClient(resolvedModel, (provider, request, baseURL) => {
  // Add reasoning options for compatible providers
  if (provider !== 'openai') {
    request.thinking = reasoningOptions;
  }
  // ...
  return requestInterceptor ? requestInterceptor(provider, request, baseURL) : request;
});
Since my provider is named 'azure-openai', the thinking field is automatically added to the request payload. However, the official Azure OpenAI API does not recognize the thinking parameter, resulting in a 400 error.
If that's acceptable, maybe change if (provider !== 'openai') to if (provider !== 'openai' && provider !== 'azure-openai')?
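A rough sketch of the change I have in mind, based on the snippet above (untested on my side):

```ts
// UI-TARS-desktop/multimodal/agent/src/agent/llm-client.ts (proposed change, untested)
return createLLMClient(resolvedModel, (provider, request, baseURL) => {
  // Only add reasoning options for providers known to accept a `thinking` field;
  // the native Azure OpenAI API rejects it with a 400.
  if (provider !== 'openai' && provider !== 'azure-openai') {
    request.thinking = reasoningOptions;
  }
  // ...
  return requestInterceptor ? requestInterceptor(provider, request, baseURL) : request;
});
```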
For your information, my config is now set as follows:
// agent-tars.config.ts
export default defineConfig({
  model: {
    provider: 'azure-openai',
    id: 'gpt-4o',
    apiKey: process.env.AZURE_OPENAI_API_KEY,
    baseURL: 'https://<resource>.openai.azure.com',
  },
});
Thank you for your help!
Hi, any update on this issue? I am facing similar issues.