ipex-llm
Ollama version too old to use it with VSCode and Copilot
Describe the bug
VSCode with Copilot needs at least version 0.6.4 of Ollama, according to this issue: https://github.com/microsoft/vscode-copilot-release/issues/8461
I tried to install the latest version of Ollama by running pip install --pre --upgrade 'ipex-llm[cpp]'. I am also not sure which Ollama version it actually ships; ./ollama -v returns 0.0.0.
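For reference, roughly the steps I ran (a sketch; init-ollama is the symlink setup script from the ipex-llm Ollama quickstart, and 11434 is Ollama's default port):

```bash
# Install/upgrade the ipex-llm Ollama backend (as above)
pip install --pre --upgrade 'ipex-llm[cpp]'

# Create the ollama binary/symlinks in the current directory
# (init-ollama on Linux, init-ollama.bat on Windows)
init-ollama

# The bundled binary prints 0.0.0 here instead of a real version
./ollama -v

# The running server also reports its version over HTTP
./ollama serve &
curl http://localhost:11434/api/version
```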
When adding Ollama inside VSCode, I get this error:
Failed to register Ollama model: TypeError: Cannot read properties of undefined (reading 'includes')
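My guess, based on the linked Copilot issue, is that Copilot reads a capabilities list from Ollama's /api/show response, which older servers do not return. A quick way to check what your server reports (the model name below is just an example):

```bash
# Ask the local Ollama server what it knows about an installed model
curl http://localhost:11434/api/show -d '{"model": "llama3.2"}'
# Newer Ollama releases (>= 0.6.4) reportedly include a "capabilities"
# array in this response; if it is missing, Copilot's check would hit
# undefined, which matches the 'includes' error above.
```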
The bundled Ollama version is currently 0.6.2; it looks like Copilot needs an API that only exists in newer Ollama. We will notify you when we finish the update.
same here