
Integrate Ollama

Open aniketmaurya opened this issue 1 year ago • 9 comments

Is your feature request related to a problem? Please describe. Ollama provides fast local LLM inference, and it would be great to integrate it with Guidance.


PS: I would love to contribute to this.

aniketmaurya avatar Mar 10 '24 14:03 aniketmaurya

Ollama is already supported via LiteLLM. You can use it like so:


from guidance import models, gen, select

llama2 = models.LiteLLMCompletion(
    model="ollama/llama2",
    api_base="http://localhost:11434"
)

# capture our selection under the name 'answer'
lm = llama2 + f"Do you want a joke or a poem? A {select(['joke', 'poem'], name='answer')}.\n"

# make a choice based on the model's previous selection
if lm["answer"] == "joke":
    lm += "Here is a one-line joke about cats: " + gen('output', stop='\n')
else:
    lm += "Here is a one-line poem about dogs: " + gen('output', stop='\n')

Warlord-K avatar Mar 11 '24 12:03 Warlord-K

I got this error: TypeError: LiteLLM.__init__() got an unexpected keyword argument 'api_base'

Is there a specific version I should roll back to for that to work?

Edit: Apparently, the word api_base is nowhere to be found in the code base anymore.
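To rule out the Ollama server itself, here is a minimal sanity check that talks to Ollama's REST API directly, with no guidance or litellm involved (the model name and prompt are just placeholders; the helper name is mine):

```python
import json
import urllib.request

OLLAMA_BASE = "http://localhost:11434"  # Ollama's default address

def build_generate_request(base: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        base.rstrip("/") + "/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request(OLLAMA_BASE, "llama2", "Say hi.")
# With a running Ollama server, uncomment to send it:
# print(urllib.request.urlopen(req).read())
```

If that request succeeds, the server is fine and the problem is on the guidance/litellm side.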

zvxayr avatar Mar 24 '24 05:03 zvxayr

May I ask if there are any updates? I would like to use Ollama as a backend. Thanks.

eliranwong avatar Apr 19 '24 06:04 eliranwong

I would also like to use Ollama as a backend. Is there any work underway to build native support for ollama/llama3?

nurena24 avatar May 21 '24 14:05 nurena24

OK, so how do we set api_base now?

EDIT: I looked at #648 and the rest of the code base. The fact that the litellm tests appear to have been stubbed out suggests that ollama/litellm support is not a priority here, and that this is going down the happy path of mainstream hosted APIs being the only real test targets; otherwise a fix based on #648 would have been merged by now.
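As a possible stopgap: Ollama also exposes an OpenAI-compatible API under /v1, so any client that accepts a base URL can point at it without litellm. A sketch (the helper name is mine, and whether this works with guidance's OpenAI model class is an open question):

```python
def openai_base_url(ollama_base: str) -> str:
    """Map an Ollama server address to its OpenAI-compatible /v1 endpoint."""
    return ollama_base.rstrip("/") + "/v1"

base = openai_base_url("http://localhost:11434")
# e.g. with the openai package: openai.OpenAI(base_url=base, api_key="ollama")
```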

rcarmo avatar May 26 '24 10:05 rcarmo