Graeme Malcolm

Results 10 comments of Graeme Malcolm

Here are the connection details I get:

```
ai-aihub911642661207_aoai
Target: https://ai-aihub911642661207.openai.azure.com/
AuthType: ApiKey

ai-aihub911642661207
Target: https://ai-aihub911642661207.cognitiveservices.azure.com/
AuthType: ApiKey
```

This may be related to the following error in the Foundry...

Does the OpenAI Chat Completions client work with non-OpenAI models? The lab uses a Phi-4 model.

@nick863 This issue is still present when using Azure.AI.Projects 1.0.0-beta.6 and Azure.AI.Inference 1.0.0-beta.4. The UI bug in the portal has been resolved, so I assume that's unrelated.

@dargilco @nick863 The issue is still present in 1.0.0-beta.8. Can you please /unresolve this until a fix is in place? Or confirm that this issue won't be fixed?

Sorry for the delayed response (I was on vacation). The issue still exists in Azure.AI.Projects --version 1.0.0-beta.9. To be clear, what I'm seeing is inconsistent behavior between the Python and...

It works in a Foundry-based project for Python, but not for .NET. I've not tried a hub-based project (I'll try tomorrow); but to be honest, that's not really a solution -...

The problem is not with models deployed to an OpenAI endpoint. It's with using the OpenAI SDK to chat with OpenAI models deployed to the default Foundry Models endpoint. Cheers,...
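The distinction above turns on which kind of endpoint a deployment sits behind. As a rough illustration, the hostnames quoted in this thread can be classified by suffix; note the suffix-to-endpoint mapping below is an assumption inferred from the URLs in these comments, not an official or exhaustive list:

```python
from urllib.parse import urlparse

def classify_endpoint(url: str) -> str:
    """Rough classification of an Azure AI endpoint by hostname suffix.

    The mappings are assumptions based on the endpoint URLs quoted in
    this thread, not an authoritative list.
    """
    host = urlparse(url).hostname or ""
    if host.endswith(".openai.azure.com"):
        return "azure-openai"           # dedicated Azure OpenAI endpoint
    if host.endswith(".cognitiveservices.azure.com"):
        return "ai-services"            # multi-service AI Services endpoint
    if host.endswith(".services.ai.azure.com"):
        return "foundry-models"         # assumed shape of a Foundry Models endpoint
    return "unknown"

# The two targets from the connection details earlier in the thread:
print(classify_endpoint("https://ai-aihub911642661207.openai.azure.com/"))
print(classify_endpoint("https://ai-aihub911642661207.cognitiveservices.azure.com/"))
```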

To repro: 1. Follow the instructions at https://microsoftlearning.github.io/mslearn-ai-studio/Instructions/02a-AI-foundry-sdk.html to create an AI Foundry project, deploy a gpt-4o model (which by default is deployed to an Azure AI Model Inference endpoint),...

OK, I think I've figured this out. You can use *either* endpoint (Azure AI Foundry project or Azure OpenAI) to get a project client, but you ***must*** explicitly specify the connection...
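To make "explicitly specify" concrete, here is a minimal stdlib sketch of the request route everything ultimately resolves to, with each piece (endpoint, deployment, API version) stated explicitly rather than inferred from a project connection. The deployment name and API version below are illustrative placeholders, not values from this thread:

```python
from urllib.parse import urlencode

def chat_completions_url(endpoint: str, deployment: str, api_version: str) -> str:
    """Build an Azure OpenAI chat-completions URL from explicit parts.

    This is a sketch of the underlying REST route; the SDK clients
    discussed in the thread assemble the equivalent request internally.
    """
    base = endpoint.rstrip("/")
    query = urlencode({"api-version": api_version})
    return f"{base}/openai/deployments/{deployment}/chat/completions?{query}"

url = chat_completions_url(
    "https://ai-aihub911642661207.openai.azure.com/",  # endpoint from the connection details above
    "gpt-4o",      # deployment name (placeholder)
    "2024-06-01",  # API version (placeholder)
)
print(url)
```

The point of the fix is simply that none of these three values is left for the client to guess.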

Yup - resolved as long as you use explicit parameters - many thanks