
Ollama support

Open lestan opened this issue 1 year ago • 2 comments

I use Ollama as my inference server for local LLMs. Ollama is supported by many LLM frameworks, but not Guidance.

Would love to see a direct integration with Ollama via the models package.

I'm aware that LiteLLM support is available and can be used to proxy Ollama, but that adds overhead and makes the solution more complex.

Supporting Ollama directly would immediately enable access to every model Ollama provides: model library
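In the meantime, since Ollama exposes an OpenAI-compatible endpoint under `/v1`, one possible workaround is pointing guidance's `models.OpenAI` wrapper at a local Ollama daemon. This is only a sketch: the default host/port and the `base_url`/`api_key` keyword pass-through are assumptions, and whether guidance's grammar constraints work over a remote chat endpoint is a separate question (see the linked issues).

```python
# Hypothetical workaround: talk to Ollama through its OpenAI-compatible API.
# The host, port, and model tag below are assumptions, not guidance defaults.

OLLAMA_HOST = "http://localhost:11434"

def ollama_openai_base_url(host: str = OLLAMA_HOST) -> str:
    """Build the OpenAI-compatible base URL served by a local Ollama daemon."""
    return host.rstrip("/") + "/v1"

def connect(model: str = "llama2"):
    # Requires a running Ollama daemon with `model` already pulled.
    # Assumes guidance's models.OpenAI forwards base_url/api_key to the
    # underlying openai client; Ollama ignores the API key value.
    from guidance import models
    return models.OpenAI(
        model,
        base_url=ollama_openai_base_url(),
        api_key="ollama",
    )
```

Even if this works for plain generation, it carries the same limitation as the LiteLLM proxy: remote endpoints cannot use guidance's token-level constrained decoding, which is the main reason a native integration in the models package would be valuable.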

lestan avatar Jan 24 '24 02:01 lestan

May I ask if there are any updates?

eliranwong avatar Apr 19 '24 06:04 eliranwong

See also:

  • https://github.com/guidance-ai/guidance/issues/687
  • https://github.com/guidance-ai/guidance/issues/648

0xdevalias avatar Jun 20 '24 00:06 0xdevalias