Model System Message Mismatches
This isn't really an issue with this program; it just captures an interesting mismatch in how "system" messages are interpreted depending on the model used.
When using OpenAI's chat completions API, the system message is flexible; I can change it on the fly, in the middle of a conversation. With Ollama and granite3.1-dense:2b, it seems the first system message in the context window always takes precedence.
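With the chat completions API, the system prompt is just another entry in the messages list sent with every request, so a client can append a fresh system message mid-conversation. A minimal sketch of what that list looks like on the wire (the history contents are abbreviated and the helper name is mine; gpt-4o honors the most recent system message, while the Granite template shown further down only ever reads the first):

```python
# Sketch of the message list the chat client sends on the second "Hello!".

def latest_system(messages):
    """Return the content of the most recent system message, or None."""
    for m in reversed(messages):
        if m["role"] == "system":
            return m["content"]
    return None

history = [
    {"role": "system", "content": "Talk like a pirate."},
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Ahoy there, matey! ..."},
    {"role": "system", "content": "Talk like a dog."},  # mid-conversation update
    {"role": "user", "content": "Hello!"},
]

print(latest_system(history))  # -> Talk like a dog.
```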
Example
Using OpenAI's API with gpt-4o, it can change from talking like a 🏴‍☠️ to a 🐕 in the same conversation:
$ openai chat
...
> erase
Chat history cleared.
> system: Talk like a pirate.
System context updated.
> Hello!
Ahoy there, matey! How be the wind in yer sails today? Be it a fine voyage yer on, or be ye in need of a bit o' pirate wisdom? Arrr!
> system: Talk like a dog.
System context updated.
> Hello!
Woof woof! Hey there, human! How's it goin'? Got any treats or belly rubs for me today? I'm all ears and waggin' tail!
Using Ollama with granite3.1-dense:2b and resuming the conversation from the previous chat, it only talks like a pirate, even though the "Talk like a dog" update is in the context:
$ OPENAI_MODEL="granite3.1-dense:2b" OPENAI_API_URL="http://localhost:11434/v1/" openai chat
...
> Hello!
Arrr, yer hearing's as sharp as a parrot's. Aye, salute to ye too, matey! Be it the morning or afternoon, remember to keep yer spirit high and thirst met with
water or grog. How be the weather on yer land today? Sun or storm, remember to keep yeself protected from the elements as a good pirate should.
Looking at the Ollama console, the user-provided system message isn't visible here (only the default, which I guess makes sense):
> /show system
[GIN] 2025/04/12 - 13:22:58 | 200 | 15.931458ms | 127.0.0.1 | POST "/api/show"
Knowledge Cutoff Date: April 2024.
You are Granite, developed by IBM.
Looking at the prompt template, the root of the mismatch appears to be the {{- if eq .Role "system" }}{{- continue }} branch:
> /show template
[GIN] 2025/04/12 - 13:24:53 | 200 | 15.441083ms | 127.0.0.1 | POST "/api/show"
<|start_of_role|>system<|end_of_role|>
{{- if and (gt (len .Messages) 0) (eq (index .Messages 0).Role "system")}}
{{- (index .Messages 0).Content}}<|end_of_text|>
{{- else }}
{{ .System }}
{{- if .Tools }} You are a helpful AI assistant with access to the following tools. When a tool is required to answer the user's query, respond with <|tool_call|> followed by a JSON list of tools used. If a tool does not exist in the provided list of tools, notify the user that you do not have the ability to fulfill the request.
{{- end }}
{{- end }}
{{- if .Tools }}
<|start_of_role|>tools<|end_of_role|>[
{{- range $index, $_ := .Tools }}
{{ . }}
{{- if and (ne (len (slice $.Tools $index)) 1) (gt (len $.Tools) 1) }},
{{- end}}
{{- end }}
]<|end_of_text|>
{{ else }} You are a helpful AI assistant.<|end_of_text|>
{{ end }}
{{- range $index, $_ := .Messages }}
{{- if eq .Role "system" }}
{{- continue }}
{{- end }}<|start_of_role|>
{{- if eq .Role "tool" }}tool_response
{{- else }}{{ .Role }}
{{- end }}<|end_of_role|>
{{- if .Content }}{{ .Content }}
{{- else if .ToolCalls }}<|tool_call|>
{{- range .ToolCalls }}{"name": "{{ .Function.Name }}", "arguments": {{ .Function.Arguments }}}
{{- end }}
{{- end }}
{{- if eq (len (slice $.Messages $index)) 1 }}
{{- if eq .Role "assistant" }}
{{- else }}<|end_of_text|>
<|start_of_role|>assistant<|end_of_role|>
{{- end }}
{{- else }}<|end_of_text|>
{{ end }}
{{- end }}
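In other words, the template takes the system prompt from the first message only when that message has the system role, and the message loop then drops every system message via `continue`. A rough Python equivalent of that logic (the default prompt text is the one from `/show system` above; the function name is mine):

```python
def render_granite_system(messages,
                          default=("Knowledge Cutoff Date: April 2024.\n"
                                   "You are Granite, developed by IBM.")):
    """Mimic the Granite chat template's handling of system messages."""
    # {{- if and (gt (len .Messages) 0) (eq (index .Messages 0).Role "system")}}
    if messages and messages[0]["role"] == "system":
        system = messages[0]["content"]
    else:
        system = default
    # {{- if eq .Role "system" }}{{- continue }}: later system messages vanish
    rendered = [m for m in messages if m["role"] != "system"]
    return system, rendered

history = [
    {"role": "system", "content": "Talk like a pirate."},
    {"role": "user", "content": "Hello!"},
    {"role": "system", "content": "Talk like a dog."},  # silently dropped
    {"role": "user", "content": "Hello!"},
]
system, rendered = render_granite_system(history)
print(system)  # -> Talk like a pirate.
```

So only the very first system message can steer the model; every later update never reaches the prompt at all.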
The mismatch, visualized:
flowchart TD
%% OpenAI API Flow
subgraph OpenAI_API_Flow [OpenAI API Flow]
OA1[System: Talk like a pirate]
OA2[User: Hello!]
OA3[Bot: Responds in pirate tone]
OA4[System update: Talk like a dog]
OA5[User: Hello!]
OA6[Bot: Responds in dog tone]
OA1 --> OA2 --> OA3 --> OA4 --> OA5 --> OA6
end
%% Ollama Flow
subgraph Ollama_Flow [Ollama Flow]
OL1[System: Talk like a pirate]
OL2[User: Hello!]
OL3[Bot: Responds in pirate tone]
OL4[System update: Talk like a dog, ignored]
OL5[User: Hello!]
OL6[Bot: Responds in pirate tone]
OL1 --> OL2 --> OL3 --> OL4 --> OL5 --> OL6
end
https://github.com/ollama/ollama/blob/f50d691254e671e69975c4e54fc4d0469b538f10/llama/llama.cpp/src/llama-chat.cpp#L520-L532
https://github.com/ollama/ollama/blob/f50d691254e671e69975c4e54fc4d0469b538f10/server/prompt.go#L41-L62
https://github.com/ollama/ollama/blob/f50d691254e671e69975c4e54fc4d0469b538f10/server/images.go#L66
https://github.com/huggingface/huggingface.js/blob/68c62013109cf5d241bc4bcd01bc2b393995bc71/packages/ollama-utils/src/chat-template-automap.ts#L457-L465
https://github.com/ibm-granite/granite-3.1-language-models