feat: Local Models Support

Open mateobelanger opened this issue 1 year ago • 2 comments

  • [x] I have looked for existing issues (including closed) about this

As requested in #125.

Feature Request

Motivation

We want to support local execution of LLMs, starting with Ollama.

Proposal

Alternatives

mateobelanger avatar Dec 05 '24 16:12 mateobelanger

Local inference through the OpenAI-compatible API is already supported, including Ollama and LM Studio. Would it make sense to also support Ollama's own API? If so, would this be a new CompletionModel, or are these classified more as Tools?

vacekj avatar Dec 28 '24 19:12 vacekj
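To illustrate the point above (not code from the thread): Ollama serves an OpenAI-compatible API under `/v1` on its default port 11434, so an existing OpenAI client can target a local model simply by swapping the base URL. The helper name below is hypothetical; a minimal sketch:

```rust
// Hypothetical helper: build the base URL an OpenAI-compatible client would
// point at to reach a local server. Ollama listens on port 11434 and exposes
// OpenAI-compatible routes under /v1; LM Studio does the same on port 1234.
fn local_base_url(host: &str, port: u16) -> String {
    format!("http://{}:{}/v1", host, port)
}

fn main() {
    // e.g. pass this URL to an OpenAI client instead of api.openai.com
    println!("{}", local_base_url("localhost", 11434));
}
```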

> Local OpenAI inference is supported, including Ollama and LM Studio. Would it make sense to also support ollama's specific API? If so, would this be a new CompletionModel, or are these classified more as Tools?

Yes, it makes a lot of sense, as written in #147! It would be a new provider that directly implements Ollama's API. Feel free to take this on as part of your existing PR, or create a new issue and PR to track it!

0xMochan avatar Dec 28 '24 19:12 0xMochan
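For context on what such a provider would send (an editorial sketch, not code from any PR): Ollama's native chat endpoint is `POST http://localhost:11434/api/chat`, taking a JSON body with `model`, `messages`, and `stream` fields. A real rig provider would build this with serde and an HTTP client; the std-only function below just shows the request shape, and its name is hypothetical.

```rust
// Hypothetical sketch of the request body for Ollama's native chat endpoint
// (POST http://localhost:11434/api/chat). A real provider would use typed
// serde structs; this uses plain string formatting to stay dependency-free.
fn ollama_chat_request(model: &str, prompt: &str) -> String {
    // "stream": false asks Ollama for a single response object
    // instead of a stream of partial chunks.
    format!(
        r#"{{"model":"{}","messages":[{{"role":"user","content":"{}"}}],"stream":false}}"#,
        model, prompt
    )
}

fn main() {
    println!("{}", ollama_chat_request("llama3", "Hello!"));
}
```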

Closing as this has already been solved by #285 (if there is anything outstanding we can keep this open).

joshua-mo-143 avatar Jun 19 '25 23:06 joshua-mo-143