
.Net: Support structured output for Ollama connector

Open crickman opened this issue 8 months ago • 1 comments

Ollama supports structured outputs, and the underlying ChatOptions type defines a ResponseFormat property.

https://ollama.com/blog/structured-outputs

It would be great to enable structured output support via OllamaPromptExecutionSettings and to provide a sample.
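A hypothetical sketch of what the requested API surface might look like, mirroring the OpenAI connector's ResponseFormat property. Note that OllamaPromptExecutionSettings.ResponseFormat does not exist yet; the property, the typeof(T) convention, and the MovieResult type below are all assumptions for illustration, not current API:

```csharp
// HYPOTHETICAL sketch: a ResponseFormat property on the Ollama settings,
// mirroring what OpenAIPromptExecutionSettings already offers.
var settings = new OllamaPromptExecutionSettings
{
    // The OpenAI connector accepts a Type here and derives the JSON schema
    // automatically; the Ollama connector could plausibly do the same.
    ResponseFormat = typeof(MovieResult)
};

var result = await kernel.InvokePromptAsync(
    "List three classic sci-fi movies.",
    new KernelArguments(settings));
```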

crickman avatar Apr 08 '25 20:04 crickman

Marking for Build as a nice-to-have.

markwallace-microsoft avatar Apr 14 '25 15:04 markwallace-microsoft

It would be great if response format were supported in Microsoft.SemanticKernel.Connectors.Ollama 1.62.0-alpha.

zlbcdn avatar Jul 27 '25 09:07 zlbcdn

Hello, I'm also looking forward to this feature. It's a must for getting answers that can be consumed in code. I know there's a workaround (described in this answer), but not knowing to what extent Ollama supports the OpenAI protocol, I'd rather not venture down that path.

gjactat avatar Jul 28 '25 16:07 gjactat

@markwallace-microsoft - I wonder if this feature ask impacts MEAI since the SK connector is forwarding to IChatClient.

I wonder if MEAI already supports structured output for Ollama IChatClient?

crickman avatar Jul 28 '25 16:07 crickman

@gjactat
Your comment helped me a lot, thank you very much! Do you have any suggestions for response format settings in C#?

zlbcdn avatar Jul 29 '25 00:07 zlbcdn

@zlbcdn You can find an example of an OpenAI “structured output” chat here.

The relevant part (specifically concerning response format parameters) is as follows:

// Initialize ChatResponseFormat object with JSON schema of desired response format.
ChatResponseFormat chatResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat(
    jsonSchemaFormatName: "movie_result",
    jsonSchema: BinaryData.FromString("""
        {
            "type": "object",
            "properties": {
                "Movies": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "Title": { "type": "string" },
                            "Director": { "type": "string" },
                            "ReleaseYear": { "type": "integer" },
                            "Rating": { "type": "number" },
                            "IsAvailableOnStreaming": { "type": "boolean" },
                            "Tags": { "type": "array", "items": { "type": "string" } }
                        },
                        "required": ["Title", "Director", "ReleaseYear", "Rating", "IsAvailableOnStreaming", "Tags"],
                        "additionalProperties": false
                    }
                }
            },
            "required": ["Movies"],
            "additionalProperties": false
        }
        """),
    jsonSchemaIsStrict: true);

// Specify response format by setting ChatResponseFormat object in prompt execution settings.
var executionSettings = new OpenAIPromptExecutionSettings
{
    ResponseFormat = chatResponseFormat
};

gjactat avatar Jul 29 '25 07:07 gjactat

@gjactat get it ! thank you very much!

zlbcdn avatar Jul 29 '25 08:07 zlbcdn

@markwallace-microsoft - I wonder if this feature ask impacts MEAI since the SK connector is forwarding to IChatClient.

I wonder if MEAI already supports structured output for Ollama IChatClient?

Indeed... the Ollama IChatClient implementation seems to fully support ResponseFormat. I'm going to read this page carefully to fully understand the ins and outs of Semantic Kernel and Microsoft.Extensions.AI and how they cooperate.
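For reference, a minimal sketch of setting ResponseFormat directly on an MEAI Ollama chat client. This assumes the Microsoft.Extensions.AI.Ollama package and a local Ollama server with a llama3.1 model pulled; method names shifted across the MEAI previews (e.g. CompleteAsync became GetResponseAsync), so check the version you have installed:

```csharp
using Microsoft.Extensions.AI;

// Talk to a local Ollama instance directly through MEAI's IChatClient.
IChatClient client = new OllamaChatClient(
    new Uri("http://localhost:11434"), "llama3.1");

var options = new ChatOptions
{
    // ChatResponseFormat.Json requests a JSON object response;
    // ChatResponseFormat.ForJsonSchema(...) can constrain it to a schema.
    ResponseFormat = ChatResponseFormat.Json
};

var response = await client.GetResponseAsync(
    "List three movies as a JSON object.", options);
Console.WriteLine(response.Text);
```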

gjactat avatar Jul 29 '25 10:07 gjactat

Yeah, the current workarounds for this seem to be either to use MEAI to get the IChatClient from SK's IChatCompletionService and then use its GetResponseAsync&lt;T&gt; method, or to use the OpenAI connector for SK instead of the Ollama connector, as mentioned in https://github.com/microsoft/semantic-kernel/issues/9919
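The first of those workarounds can be sketched roughly like this. It assumes recent SK/MEAI preview packages where the AsChatClient bridge and MEAI's structured-output GetResponseAsync&lt;T&gt; extension are both available; the Movie record is illustrative, and exact method names have moved between preview versions:

```csharp
using Microsoft.Extensions.AI;
using Microsoft.SemanticKernel.ChatCompletion;

public record Movie(string Title, string Director, int ReleaseYear);

// Bridge SK's chat completion service to an MEAI IChatClient...
IChatCompletionService chatService =
    kernel.GetRequiredService<IChatCompletionService>();
IChatClient chatClient = chatService.AsChatClient();

// ...then use MEAI's structured-output extension, which injects a JSON
// schema for Movie and deserializes the model's reply into it.
var response = await chatClient.GetResponseAsync<Movie>(
    "Recommend one classic sci-fi movie.");
Movie movie = response.Result;
```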

Both of these are suboptimal, and both prevented me from showing this part of SK in a workshop last week; I chose instead to teach strongly-typed responses through MEAI's IChatClient, which gives you a slightly better developer experience for that anyway.

In my workshop setup I give participants code that works with OpenAI, Azure OpenAI, or local Ollama models, and that tends to work well, but this definitely needs official support in PromptExecutionSettings before I'm comfortable including it in a workshop like that.

IntegerMan avatar Aug 16 '25 21:08 IntegerMan

I was wondering where that option was. As I remember it, it always existed, but in the beginning I was only using the OpenAI settings with Ollama (that's why).

Nowadays we can enforce JSON output by passing, in the PromptExecutionSettings ExtensionData dictionary, the key "response_format" with the value "json_object" (or a JSON element for schema formatting), and by describing the expected properties, schema, and format in the user message. The settings extension applies this when converting the settings into ChatOptions: https://github.com/microsoft/semantic-kernel/blob/main/dotnet/src/SemanticKernel.Abstractions/AI/PromptExecutionSettingsExtensions.cs#L91
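Concretely, that ExtensionData workaround looks roughly like this (a sketch; the "response_format" key is the one forwarded to ChatOptions.ResponseFormat by the linked extension method, and the prompt wording is illustrative):

```csharp
using Microsoft.SemanticKernel;

var settings = new PromptExecutionSettings
{
    // "json_object" forces a JSON reply; a JsonElement holding a schema
    // can be supplied instead for schema-constrained output.
    ExtensionData = new Dictionary<string, object>
    {
        ["response_format"] = "json_object"
    }
};

// The expected shape still has to be described in the prompt itself.
var result = await kernel.InvokePromptAsync(
    "Return a JSON object with Title and Year for one classic movie.",
    new KernelArguments(settings));
```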

So I'm also looking forward to this feature.

Maybe it just needs the addition of a formal property for the response format, like the one in the OpenAI settings: https://github.com/microsoft/semantic-kernel/blob/main/dotnet/src/Connectors/Connectors.OpenAI/Settings/OpenAIPromptExecutionSettings.cs#L178

tennaito avatar Aug 20 '25 21:08 tennaito