noahlwest
Following this. I've seen other instances of the LLM trying to give us a nicely formatted output only to end up breaking the JSON format; this seems related.
@mikebz With gemini-2.5-pro-preview-03-25, EnableToolUseShim=true, and Quiet=true, and kubectl-ai built from head (commit 4d4005ae9c1831ccb61b1d27bf86a68b1917fc5a ) I've noticed JSON errors generally when running the eval suite, with `no JSON code block found...
Ack, this one is from one of my recent changes. The issue here is an oversight from adding the in-memory chat store: calling Initialize no matter what, which...
It could be helpful to know about your environment and the steps to reproduce: - OS: [e.g. Ubuntu 22.04] - kubectl-ai version (run `kubectl-ai version`): [e.g. 0.3.0] - LLM provider:...
I took a look at this; I'm pretty sure the reason it's not working as intended is that when we clear, we try to initialize the gollm chat with empty...
Right now in main.go we set this with `klogFlags.Set("log_file", filepath.Join(os.TempDir(), "kubectl-ai.log"))`. Maybe we change it to use os.UserHomeDir() instead of the temp dir?