
Ollama Python library

Results: 67 ollama-python issues

Chat with history is perhaps the most common use case; in fact, ``ollama run`` works like that. An example of that use case would be great for newcomers. Here's...
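
A minimal sketch of such a history-keeping loop, assuming a locally pulled model (the model name is only an example) and a running Ollama server:

```python
import ollama

messages = []  # running conversation history

def chat_turn(user_input: str, model: str = 'mistral') -> str:
    # Append the new user message, send the whole history, then store the reply
    messages.append({'role': 'user', 'content': user_input})
    response = ollama.chat(model=model, messages=messages)
    reply = response['message']['content']
    messages.append({'role': 'assistant', 'content': reply})
    return reply

print(chat_turn('Why is the sky blue?'))
print(chat_turn('Can you summarise that in one sentence?'))
```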

Hi, I would like to reopen the issue, as the suggestion does not work. Thanks: https://github.com/ollama/ollama-python/issues/84

I checked the pull progress example; it seems to return only partial progress info. Is it possible to get total progress info? For example, I need to make a...
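
One way to approximate an overall figure is to aggregate the per-layer byte counts yourself. A rough sketch, assuming each streamed part behaves like a dict exposing `digest`, `completed` and `total` fields (the overall total only becomes accurate once every layer has reported at least once):

```python
import ollama

# Aggregate per-layer byte counts into one overall percentage.
layers = {}
for part in ollama.pull('mistral', stream=True):
    digest = part.get('digest')
    if digest and part.get('total'):
        layers[digest] = (part.get('completed') or 0, part['total'])
        done = sum(c for c, _ in layers.values())
        total = sum(t for _, t in layers.values())
        print(f'overall: {done / total * 100:.1f}%', end='\r')
```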

Simple code like the one below: `ollama.chat(model='mistral:instruct', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])` OR ``` import ollama response = ollama.chat(model='mistral:instruct', messages=[ { 'role': 'user', 'content': 'Why is the sky blue?',...

Is there a way (or could there be) to select which GPU to run on when generating chat responses?

Hello all, I'm trying to use the system message as described below. Every time I use it I don't get any answer from the LLM. messages = [ {'role': 'system', 'content':...
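
For reference, a system message is normally sent alongside at least one user message. A minimal sketch (the model name is only an example):

```python
import ollama

# The system message sets behaviour; the user message carries the actual question.
messages = [
    {'role': 'system', 'content': 'You are a concise assistant that answers in one sentence.'},
    {'role': 'user', 'content': 'Why is the sky blue?'},
]
response = ollama.chat(model='mistral', messages=messages)
print(response['message']['content'])
```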

Hello, I failed to run this example after installing Ollama and the llava model. I pasted the run log here. Please help take a look at this issue and, if possible, update...
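
For comparison, a minimal multimodal call usually looks like the sketch below. The image path is a placeholder, and it assumes the llava model has been pulled and the server is running:

```python
import ollama

# './photo.png' is a placeholder path; replace it with a real local image file.
response = ollama.chat(
    model='llava',
    messages=[{
        'role': 'user',
        'content': 'What is in this picture?',
        'images': ['./photo.png'],
    }],
)
print(response['message']['content'])
```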

# :grey_question: About While trying to play with `ollama-python` in a brand new env, I found out that following the current steps of the README does not take into account...

It seems I'm not the only one who looked at the README and assumed that the library takes care of running the backend, resulting in a "Connection Refused" error...
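
The library talks to an already-running Ollama server (by default at http://localhost:11434); it does not start the backend itself. A small sanity-check sketch, assuming connection failures surface as `httpx.ConnectError` or `ConnectionError` depending on the library version:

```python
import httpx
import ollama

try:
    # Any cheap API call works as a reachability check.
    print(ollama.list())
except (httpx.ConnectError, ConnectionError):
    print('Could not reach the Ollama server; start the backend first, e.g. with `ollama serve`.')
```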

Ollama may be protected by a reverse proxy enforcing basic auth. When the Ollama URL contains basic-auth credentials, ollama-python strips them from the URL, leading to an HTTP 401...
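
A possible workaround sketch: pass the credentials as an explicit `Authorization` header instead of embedding them in the URL. The proxy URL, credentials and model name below are placeholders, and it assumes `Client` forwards a `headers` keyword argument to its underlying HTTP client:

```python
import base64
from ollama import Client

# Placeholder credentials; do not embed them in the host URL.
credentials = base64.b64encode(b'user:password').decode()

client = Client(
    host='https://ollama.example.com',  # proxy URL without user:pass@
    headers={'Authorization': f'Basic {credentials}'},
)
print(client.chat(model='mistral', messages=[{'role': 'user', 'content': 'ping'}]))
```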