Error: starting k8s agent: initializing chat session: Initialize not yet implemented for ollama
Hi,
I am currently trying to use kubectl-ai with ollama running on a remote system and I am getting the following error:
export OLLAMA_HOST=http://192.168.1.155:11434/
kubectl-ai --llm-provider ollama --model gemma3:12b-it-qat
Error: starting k8s agent: initializing chat session: Initialize not yet implemented for ollama
I am able to connect to the ollama endpoint:
curl http://192.168.1.155:11434/
Ollama is running
kubectl-ai version is:
kubectl-ai version
version: 0.0.22
commit: f1516148e4bd9a39f0c3af63256fcf8c6e521ea9
date: 2025-08-14T18:00:57Z
Not sure if this is a bug?
Aaah.. this is a bug that got introduced recently. If you are blocked on it, please use v0.0.20.
@noahlwest
For LLM providers that don't support session persistence:
- If the user opts in to using a session, we should print a warning.
- For in-memory, there shouldn't be any errors at all.
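A minimal Go sketch of the behavior described above (the names and structure are illustrative, not the actual kubectl-ai code): warn only when the user explicitly opted into sessions with a provider that can't persist them, and stay silent on the default in-memory path.

```go
package main

import "fmt"

// sessionNotice sketches the proposed policy. Both parameter names are
// hypothetical; the real provider capability check in kubectl-ai may differ.
func sessionNotice(supportsSessions, userOptedIn bool) string {
	if userOptedIn && !supportsSessions {
		// Explicit opt-in with an unsupported provider: warn, don't fail.
		return "warning: provider does not support session persistence; using in-memory chat"
	}
	// Default in-memory path: no warning, no error.
	return ""
}

func main() {
	fmt.Println(sessionNotice(false, true))
	fmt.Println(sessionNotice(false, false))
}
```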
Ack, this one is from one of my recent changes. The issue is an oversight from adding the in-memory chat store: Initialize is called no matter what, whereas previously a nil check would have caught the case where there was no in-memory store. Made #492 to log the error and return nil instead as a quick fix.
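The quick fix in #492 can be sketched roughly like this (hypothetical function and error names; the actual patch may look different): the Initialize error is logged rather than propagated, so startup no longer aborts for providers like ollama that haven't implemented it.

```go
package main

import (
	"errors"
	"fmt"
	"log"
)

// errNotImplemented mimics the error seen in the report above.
var errNotImplemented = errors.New("Initialize not yet implemented for ollama")

// initChatSession sketches the quick fix: instead of returning the
// Initialize error (which aborted agent startup), log it and return nil.
func initChatSession(initialize func() error) error {
	if err := initialize(); err != nil {
		log.Printf("chat session initialization skipped: %v", err)
		return nil // don't fail startup for providers without session support
	}
	return nil
}

func main() {
	// With the fix, a failing Initialize no longer breaks startup.
	err := initChatSession(func() error { return errNotImplemented })
	fmt.Println("startup error:", err)
}
```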
Thanks @noahlwest for the quick turnaround. https://github.com/GoogleCloudPlatform/kubectl-ai/pull/492 is now merged.
@infinitydon it will be available in the next release (v0.0.23).