ollama-python
Ollama Python library
The same question as #114: there, the annotation of `Client.embedding` was modified, but `AsyncClient` was forgotten.
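For reference, a minimal sketch of the async embeddings call whose annotation is at issue (assuming the `ollama` package and a running server; the actual network call is left commented, with a placeholder return so the sketch runs standalone):

```python
import asyncio

async def embed(text: str) -> list:
    # Hedged sketch: the real call requires `ollama` and a running server.
    # from ollama import AsyncClient
    # resp = await AsyncClient().embeddings(model="llama3", prompt=text)
    # return resp["embedding"]
    return []  # placeholder so the sketch runs without a server

vec = asyncio.run(embed("hello"))
```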
Please add support for processing and output in Ukrainian.
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.4.1 to 0.4.3. Release notes, sourced from ruff's releases — v0.4.3 changes. Enhancements: add support for PEP 696 syntax (#11120). Preview features: [refurb] use function range for reimplemented-operator...
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.4.3 to 0.4.4. Release notes, sourced from ruff's releases — v0.4.4 changes. Preview features: [pycodestyle] ignore end-of-line comments when determining blank line rules (#11342); [pylint] detect pathlib.Path.open calls...
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.4.3 to 0.4.5. Release notes, sourced from ruff's releases — v0.4.5 changes. Ruff's language server is now in Beta: v0.4.5 marks the official Beta release of ruff server,...
![image](https://github.com/ollama/ollama/assets/78810304/452ca86f-941f-4ff7-b3bf-22c39a3e3c24) I analyzed the problem in depth: I get faster responses when I use the terminal, so something is wrong on the Python side. It only uses E-cores, and it's too slow. I HAVE:...
I was using Ollama fine before upgrading to 0.1.38:
```
raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: model 'wizardlm2:7b' not found
```
I am sure the files are in ~/.ollama/models. However, *...
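A hedged recovery sketch for this error: parse the model name out of the error body and pull it before retrying. The regex assumes the server reports the missing model in the quoted form shown above; the client calls, which need a running server, are left commented:

```python
import re
from typing import Optional

def missing_model(error_text: str) -> Optional[str]:
    # Assumes the quoted form, e.g. "model 'wizardlm2:7b' not found".
    m = re.search(r"model '([^']+)' not found", error_text)
    return m.group(1) if m else None

# With the real client (requires `ollama` and a running server):
#   import ollama
#   try:
#       ollama.chat(model="wizardlm2:7b",
#                   messages=[{"role": "user", "content": "hi"}])
#   except ollama.ResponseError as e:
#       name = missing_model(str(e))
#       if name:
#           ollama.pull(name)
```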
I am trying to use ollama.Client to connect to a remote server for chat. Server A: http://192.168.0.123:11434, Ollama installed with Docker, ollama-python v0.2.0. Local machine: M1 Max MacBook Pro, Ollama installed with Docker,...
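For connecting to a remote server, the client accepts a `host` argument; a minimal sketch (the address, port, and model name below are illustrative, and the network call itself is left commented since it needs a reachable server):

```python
def ollama_host(ip: str, port: int = 11434) -> str:
    # 11434 is Ollama's default listening port.
    return f"http://{ip}:{port}"

# With the real client (requires `ollama` and a reachable server):
#   import ollama
#   client = ollama.Client(host=ollama_host("192.168.0.123"))
#   reply = client.chat(model="llama3",
#                       messages=[{"role": "user", "content": "hello"}])
#   print(reply["message"]["content"])
```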
# Description I hit a 503 status code error when running Ollama and connecting to localhost for chat on Apple silicon. Local machine: M3 Max MacBook Pro, Ollama, llama3, Python 3.11...
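Before debugging a 503 at the client, it can help to confirm the server answers at all: Ollama's root endpoint replies to a plain GET with "Ollama is running". A stdlib-only health-check sketch (the host and timeout values are illustrative):

```python
import urllib.request
import urllib.error

def server_up(host: str = "http://127.0.0.1:11434", timeout: float = 2.0) -> bool:
    # Returns True only if the Ollama root endpoint answers with HTTP 200.
    try:
        with urllib.request.urlopen(host, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```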