amitrintzler
It didn't help. When using the CLI and running `kubectl-ai --mcp-server`, it doesn't produce any log output, not even in the location mentioned above.
I tried running this from the inspector to analyze its output: `kubectl-ai --llm-provider=azopenai --model=gpt-4o-128k --mcp-server`. I also tried it in cmd, but I'm not sure whether SSE is supported yet. I want...
Is there any reason why Ollama models are not yet supported by the agent in VS Code? I can use Ollama only in ask mode, not in agent mode.