obsidian-copilot
Feature Request: ollama as a backend option
This is a really great local LLM backend that works on many platforms (including Intel Macs) and is basically a one-click install.
Main site: https://ollama.ai/
API docs: https://github.com/jmorganca/ollama/blob/main/docs/api.md
Article about indexing an Obsidian vault: https://ollama.ai/blog/llms-in-obsidian
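
For a sense of what backend support could involve, here is a minimal sketch of a non-streaming call to Ollama's `/api/generate` endpoint from TypeScript, assuming Ollama is running on its default local port (11434) and a model like `llama2` has already been pulled; the function name and model tag are just illustrative, not part of the plugin's existing code.

```ts
// Sketch: one-shot (non-streaming) text generation against a local Ollama server.
async function generateWithOllama(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama2", // any locally pulled model tag (assumption for the example)
      prompt,
      stream: false,   // ask for a single JSON object instead of a token stream
    }),
  });
  if (!res.ok) {
    throw new Error(`Ollama request failed: ${res.status}`);
  }
  const data = await res.json();
  return data.response; // the generated text
}
```

Ollama also streams newline-delimited JSON by default, which would fit a chat-style UI, but the non-streaming form above keeps the sketch simple.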