
Replace Copilot with local AI

35 llama-coder issues

I'm working on a GitHub repository in VS Code and I get this error in the Codellama output: `[info] Unsupported document: vscode-vfs://github/docker/dockercraft/docker-compose.yaml ignored.` ![image](https://github.com/ex3ndr/llama-coder/assets/123007054/96bd5915-1cce-4fb8-a8e7-b9678fd0ad59)

Hi, I use VS Code's Remote SSH / Remote Containers plugin for most of my development. Testing out llama-coder, I initially ran ollama on another machine on my local network, separate from...
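For a setup like this, the extension has to be pointed at the remote Ollama server instead of the default localhost endpoint. A minimal settings.json sketch, assuming the keys are named `inference.endpoint` and `inference.model` (the exact key names should be verified in the extension's Settings UI) and a made-up LAN address:

```jsonc
{
  // Ollama server running on another machine on the local network
  // (192.168.1.50 is a placeholder address)
  "inference.endpoint": "http://192.168.1.50:11434",
  // Model used for completions; it must already be pulled on that machine
  "inference.model": "stable-code:3b-code-q4_0"
}
```

Note that the remote Ollama instance also has to accept non-local connections, e.g. by starting it with `OLLAMA_HOST=0.0.0.0` so it listens on all interfaces rather than only 127.0.0.1.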

As the title says: if I've already pulled the new (as of 2024-01-30) codellama-70b from Meta (or the Python variant), will Llama Coder use this? Or does it download the 34b and...

- Mac Mini M1 (Silicon)
- macOS 13.6 Ventura
- Cursor 0.26.2 (Cursor is a VS Code clone and seems to be compatible with Llama Coder)
- Llama Coder I...

I have an ollama container running the stable-code:3b-code-q4_0 model. I'm able to interact with the model via curl: `curl -d '{"model":"stable-code:3b-code-q4_0", "prompt": "c++"}' https://notarealurl.io/api/generate` and get a response in a...
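For comparison, here is the same request in non-streaming form against Ollama's standard `/api/generate` endpoint (`"stream": false` makes the server return a single JSON object instead of a stream of chunks); the hostname is the placeholder from the snippet above:

```sh
# Non-streaming completion request to the Ollama HTTP API
curl -s https://notarealurl.io/api/generate \
  -d '{"model": "stable-code:3b-code-q4_0", "prompt": "c++", "stream": false}'

# Expected shape of the reply (abridged):
# {"model":"stable-code:3b-code-q4_0","response":"...","done":true}
```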

Hi. I installed it locally on my M1 and it works in the CLI. When I click on Llama Coder in the top-right corner (status bar) of VS Code it does...

It always shows "Unknown model undefined" when I try to use it. Ollama is running and I've tried different models. Reinstalled several times with no help.
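"Unknown model undefined" usually means the model name configured in the extension does not match anything the Ollama server actually has pulled. Two quick checks, assuming a default local install on port 11434:

```sh
# List the models the local Ollama server knows about; the name configured in
# the extension (including the tag, e.g. ":3b-code-q4_0") must match exactly
ollama list

# Or ask the server directly over HTTP
curl -s http://127.0.0.1:11434/api/tags
```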

There are coding models out now that are better than CodeLlama. Any plans to add support for those?

Is there a key binding, or a way to create one, to pause the autocomplete?
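If the extension exposes a pause/toggle command, it could be bound in keybindings.json. A sketch with a hypothetical command ID (the real ID, if one exists, shows up when filtering the Keyboard Shortcuts editor for "llama"):

```jsonc
// keybindings.json
// "llama.coder.pause" is a hypothetical command ID used only to illustrate
// the binding; replace it with whatever command the extension actually exposes.
[
  {
    "key": "ctrl+alt+p",
    "command": "llama.coder.pause"
  }
]
```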

This would allow it to be used with VSCodium (de-Microsofted VS Code). The registry is here: https://open-vsx.org/