ollama-python
README doesn't mention that a running ollama server is required
It seems I'm not the only one who looked at the README and assumed that the library takes care of running the backend, resulting in a "Connection Refused" error when trying the example code in the README. If I understand correctly, I need to run the ollama server first. This should perhaps be made clear in the README.
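Until the README is updated, a quick reachability check before calling the library can turn the opaque "Connection Refused" into an actionable message. This is a minimal stdlib-only sketch; it assumes the server is on its default address `http://localhost:11434` (the helper name `ollama_server_running` is my own, not part of the library):

```python
import urllib.request
import urllib.error

OLLAMA_URL = "http://localhost:11434"  # default address the ollama server listens on

def ollama_server_running(url: str = OLLAMA_URL) -> bool:
    """Return True if something answers HTTP on `url`, False on refusal/timeout."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Covers "Connection refused" (WinError 10061 on Windows) and timeouts.
        return False

if not ollama_server_running():
    print("No ollama server found - start one first, e.g. `ollama run llama2`.")
```

The same check works from a Django view or a startup hook before the first library call.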
This must be very frustrating for many users who are just starting out.
How can I run the Ollama server? I'm encountering this error in my Django app: [WinError 10061] No connection could be made because the target machine actively refused it. Thank you so much for your help.
I solved it by running the Docker container, but it runs really, really slowly, even with a light model like Phi.
You need to have a local ollama server running to be able to continue. To do this:
- Download: https://ollama.com/
- Run an LLM: https://ollama.com/library
- Example: `ollama run llama2`
- Example: `ollama run llama2:70b`
There is a PR to add this to the docs (mentioned above).
I had to update the README while working on iterative chats (chats with history), so I ended up migrating these changes into the PR here: https://github.com/ollama/ollama-python/pull/64
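For context on what "chats with history" means here: `ollama.chat` takes a `messages` list of `{"role": ..., "content": ...}` dicts, and history is kept by re-sending the growing list each turn. The sketch below shows only that bookkeeping; the real model call is stubbed out by a `reply_fn` parameter (my own hypothetical name) so it runs without a server:

```python
# Sketch of maintaining chat history in the message format ollama.chat expects.
# `reply_fn` stands in for the server; in real code it would be something like:
#   ollama.chat(model="llama2", messages=history)["message"]["content"]

def chat_turn(history, user_text, reply_fn):
    """Append the user message, obtain a reply, append it, return the reply text."""
    history.append({"role": "user", "content": user_text})
    reply = reply_fn(history)
    history.append({"role": "assistant", "content": reply})
    return reply

history = []
chat_turn(history, "Hello", lambda h: "Hi there!")
chat_turn(history, "What did I say first?", lambda h: f"You said: {h[0]['content']}")
```

Because the full `history` list is passed on every call, the second turn can "remember" the first; forgetting to re-send earlier messages is the usual reason a chat loses context.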