emacs-copilot
Ollama integration
Hi, thanks for publishing this. If I may, I'd like to suggest/ask that you make a version that works with one of the common LLM management systems that have been coming out lately. Ollama is one (my preference), but gpt4all is another example. They run the LLM internally and expose a REST API for callers to consume, doing the work of making the most of whatever hardware the user has available (mixed GPU and CPU if possible/needed, etc.). It would be more extensible than calling llama directly.
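To illustrate the idea, here is a minimal sketch of talking to Ollama's documented `/api/generate` REST endpoint on its default port 11434. The model name `codellama` and the helper names are illustrative assumptions, not anything from this repo:

```python
import json
import urllib.request


def build_generate_request(prompt, model="codellama"):
    """Build the JSON payload for Ollama's /api/generate endpoint.
    stream=False asks for a single JSON response instead of a stream."""
    return {"model": model, "prompt": prompt, "stream": False}


def ollama_generate(prompt, model="codellama", host="http://localhost:11434"):
    """POST a completion request to a local Ollama server and return
    the generated text from the 'response' field of its JSON reply.
    Requires a running Ollama instance with the model pulled."""
    body = json.dumps(build_generate_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because the server handles model loading and hardware placement, an editor integration like this only has to build JSON and read it back, rather than shelling out to a llama binary.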
Second that. Ollama support would be great.
This one works charmingly well with Ollama: https://github.com/s-kostyaev/ellama