Codebase Indexing: "Ollama model is not embedding capable"
Description
I tried using the recommended embedding models (mxbai-embed-large and nomic-embed-text:latest, as well as bge-m3:567m). Each displays "Error - Ollama model is not embedding capable: (model name)" upon selection.
Kilo Code version: 4.64.3
Pardon me for jumping in, as I also use Ollama embedding models and saw this issue. I can confirm that Ollama embeddings work fine on Kilo Code version 4.82.0.
To really confirm this issue, @spookynando needs to check whether the local Ollama server is running properly using curl:
curl -H "Content-Type: application/json" http://localhost:11434/api/embed -d '{"model": "<ollama-embed-model>", "input": ["text"]}'
If the curl command returns embeddings correctly but the issue persists, then it might be a bug somewhere in Kilo Code.
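If it helps, here is a minimal TypeScript sketch of the same check the curl command performs. It assumes Ollama is listening on its default port (11434) and that the /api/embed response carries an `embeddings` array; nomic-embed-text is just a placeholder model name, swap in whichever model you have pulled:

```typescript
// Minimal sketch of the same check as the curl command above.
// Assumes Ollama's default port and the /api/embed response shape.
const OLLAMA_URL = "http://localhost:11434/api/embed";
const MODEL = "nomic-embed-text"; // placeholder, replace with your model

async function checkEmbeddingCapable(): Promise<void> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: MODEL, input: ["text"] }),
  });

  if (!res.ok) {
    // A non-2xx status here points at the Ollama side, not Kilo Code.
    console.error(`Ollama returned HTTP ${res.status}: ${await res.text()}`);
    return;
  }

  const data = (await res.json()) as { embeddings?: number[][] };
  if (data.embeddings && data.embeddings.length > 0) {
    console.log(`Model is embedding capable, vector length: ${data.embeddings[0].length}`);
  } else {
    console.error("No embeddings in the response; the model may not be embedding capable.");
  }
}

checkEmbeddingCapable().catch(console.error);
```

If this check succeeds but Kilo Code still shows the error, that narrows it down to the extension rather than the local model setup.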
kilocode.kilo-code: 4.93.2
embeddinggemma does not work either.
This has to do with Ollama. I recently upgraded Ollama, and the same models that had been working stopped working.
In the Ollama logs everything looks fine, but it doesn't work in Kilo Code.
TL;DR: set the model dimensions to 768.
While dimensions like 1536 or 2048 are standard vector sizes for other popular models, they are not native to Google's EmbeddingGemma.
This model is trained to produce a maximum embedding dimension of 768. Requesting a larger, non-native size results in an incompatible configuration and a failure on the Ollama side.
Try setting model dimensions to any of the following:
| Dimension | Trade-off |
|---|---|
| 768 | Quality |
| 512 | Balance |
| 256 | High Speed |
| 128 | Max Speed & Storage Savings |
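For anyone wondering why only these sizes work: EmbeddingGemma's native output is 768 dimensions, and the smaller sizes are obtained by truncating that vector on the client side (Matryoshka-style) rather than by asking the model for a different size. Below is a rough TypeScript sketch of that idea, not Kilo Code's actual implementation; the endpoint, port, and response shape are the Ollama defaults assumed earlier:

```typescript
// Sketch: fetch embeddinggemma's native 768-dim vector from Ollama and
// truncate it to one of the smaller supported sizes. The first N components,
// re-normalized, serve as a valid lower-dimensional embedding.
const OLLAMA_EMBED_URL = "http://localhost:11434/api/embed";

async function embedWithDimension(text: string, dim: 768 | 512 | 256 | 128): Promise<number[]> {
  const res = await fetch(OLLAMA_EMBED_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "embeddinggemma", input: [text] }),
  });
  const data = (await res.json()) as { embeddings: number[][] };
  const full = data.embeddings[0]; // native size: 768

  // Truncate to the requested dimension and re-normalize to unit length,
  // so cosine similarity still behaves as expected.
  const truncated = full.slice(0, dim);
  const norm = Math.sqrt(truncated.reduce((sum, x) => sum + x * x, 0));
  return truncated.map((x) => x / norm);
}

// Example: a 256-dim vector for faster search and smaller storage.
embedWithDimension("hello world", 256).then((v) => console.log(v.length)); // 256
```

Asking for 1536 or 2048 has no such truncation path, which is why those settings fail on the Ollama side.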