
Ollama Python library

Results 67 ollama-python issues

I keep seeing that OpenAI has function calling [https://platform.openai.com/docs/api-reference/chat/create] (now called tools), and some open-source LLMs also support function calling. This is done by having the models fine tuned...

Really helpful project! However, I ran into a problem when I turn off the Wi-Fi connection. - OS: Windows 10 LTSC - CPU: R7-7840H - Language: Python ``` Traceback (most recent call last):...

I tried integrating Ollama into [GPT-Subtrans](https://github.com/machinewrapped/gpt-subtrans/tree/ollama-support) this weekend. It launches ollama in a subprocess and makes a series of requests to translate subtitles in batches. The first request works, but...

# :grey_question: About `ollama-python` is a very very convenient way to deal with local LLMs. Actually we can do a lot of stuff from the sdk facade, and I do...

How do you set the max tokens with the Ollama Python library? My code doesn't seem to work. Thanks! `output = ollama.chat(model='my_model', max_token=5, messages=[])`
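A likely fix, assuming the problem is the parameter name: the Python client has no `max_token` keyword; generation limits are passed inside the `options` dict as `num_predict`, mirroring the Ollama REST API's model options. A minimal sketch (the live call needs a running Ollama server, so it is commented out; `my_model` is the placeholder name from the question):

```python
# Sketch: cap response length via options["num_predict"] rather than a
# (nonexistent) max_token keyword.
messages = [{"role": "user", "content": "Say hello."}]
options = {"num_predict": 5}  # stop after at most 5 generated tokens

# import ollama
# output = ollama.chat(model="my_model", messages=messages, options=options)
# print(output["message"]["content"])
```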

I attempted to run the example, but encountered the following errors. Python 3.12.2 (main, Feb 6 2024, 20:19:44) [Clang 15.0.0 (clang-1500.1.0.2.5)] on darwin ``` import ollama from ollama import...

`response = ollama.Client(host='http://xxxxx').generate(model='gemma', prompt=prompt, format='json', options={"seed": 101, "temperature": 0}, keep_alive=7)` The request runs until it times out, with no response.
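One thing worth checking in reports like this: `ollama.Client` forwards extra keyword arguments to the underlying `httpx.Client`, so an explicit request timeout can be set instead of hanging indefinitely, and `keep_alive` also accepts duration strings. A hedged sketch (the host is the elided placeholder from the report, and the commented call requires a live server):

```python
# Sketch, assuming ollama.Client forwards extra kwargs (like `timeout`)
# to the underlying httpx.Client.
host = "http://xxxxx"  # placeholder host from the report
request_options = {"seed": 101, "temperature": 0}

# import ollama
# client = ollama.Client(host=host, timeout=120)  # fail fast instead of hanging
# response = client.generate(
#     model="gemma",
#     prompt=prompt,
#     format="json",
#     options=request_options,
#     keep_alive="5m",  # duration string; a bare integer like 7 is seconds
# )
```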

I want to use the model glmchat-6b. I read the doc: Clone the HuggingFace repository (optional). If the model is currently hosted in a HuggingFace repository, first clone that repository to...

Add an example using continuous dialogue.
```python
import ollama

def get_response(message_history):
    model_name = 'qwen:7b'
    try:
        response = ollama.chat(model=model_name, messages=message_history, stream=False)
        received_message = response['message']
        return received_message['content'], received_message
    except Exception as e:...
```
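For a runnable illustration, the history-management half of a continuous dialogue is plain Python: append each user turn and each assistant reply to `message_history` and resend the whole list on every call. A sketch under that assumption; the real `ollama.chat` call is commented out since it needs a live server, and `fake_reply` is a hypothetical stand-in for the model:

```python
# Continuous-dialogue sketch: keep the full turn list and append both
# sides of each exchange so the model always sees prior context.
def fake_reply(history):
    # import ollama
    # response = ollama.chat(model="qwen:7b", messages=history, stream=False)
    # return response["message"]
    return {"role": "assistant", "content": f"echo: {history[-1]['content']}"}

message_history = []
for user_text in ["Hi", "What did I just say?"]:
    message_history.append({"role": "user", "content": user_text})
    reply = fake_reply(message_history)
    message_history.append(reply)  # keep the assistant turn for context

print(len(message_history))  # 4 turns: two user, two assistant
```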