feat: Local Models Support
- [x] I have looked for existing issues (including closed) about this

As requested in #125.
Feature Request
Motivation
We want to support local execution of LLMs. Starting with ollama.
Proposal
Alternatives
Local OpenAI inference is supported, including Ollama and LM Studio. Would it make sense to also support ollama's specific API? If so, would this be a new CompletionModel, or are these classified more as Tools?
> Local OpenAI inference is supported, including Ollama and LM Studio. Would it make sense to also support ollama's specific API? If so, would this be a new CompletionModel, or are these classified more as Tools?
Yes, it makes a lot of sense, as written in #147! It would be a new provider that directly implements ollama's API. Feel free to take this on as part of your existing PR, or create a new issue and PR to track it!
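For context, such a provider would target Ollama's native HTTP API (served by default at `http://localhost:11434`) rather than its OpenAI-compatible endpoint. A minimal Python sketch of the request shape, assuming a locally running Ollama server (the model name `llama3` is purely illustrative):

```python
import json
import urllib.request

OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # Ollama's native chat endpoint

def build_chat_request(model: str, prompt: str) -> dict:
    # Ollama's native /api/chat payload: model name, a message list,
    # and stream=False to receive a single JSON response.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def chat(model: str, prompt: str) -> str:
    # Requires a running local Ollama server; shown for illustration only.
    req = urllib.request.Request(
        OLLAMA_CHAT_URL,
        data=json.dumps(build_chat_request(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]

payload = build_chat_request("llama3", "Hello!")
```

A dedicated provider would wrap this request/response cycle behind the library's completion interface instead of routing through the OpenAI-compatible path.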
Closing as this has already been solved by #285 (if there is anything outstanding we can keep this open).