README doesn't mention that a running ollama server is required

Open jmccrosky opened this issue 5 months ago • 5 comments

It seems I'm not the only one who looked at the README and assumed that the library takes care of running the backend, resulting in a "Connection refused" error when trying the example code in the README. If I understand correctly, I need to run the ollama server first. This should perhaps be made clear in the README.
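For anyone else hitting this: the client only talks to a server that is already running (by default at `http://localhost:11434`). A minimal pre-flight check using just the standard library might look like this (the helper name is mine, not part of the ollama package):

```python
import urllib.error
import urllib.request


def ollama_is_running(host: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server responds at `host`.

    A running server answers GET / with HTTP 200; a missing server
    raises a connection error, which we translate into False.
    """
    try:
        with urllib.request.urlopen(host, timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


if not ollama_is_running():
    print("Ollama server not reachable - start it first (e.g. `ollama serve`).")
```

Calling this before the README example turns the opaque "Connection refused" traceback into an actionable message.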

jmccrosky avatar Jan 26 '24 10:01 jmccrosky

this must be very frustrating for most users just getting started

g1ra avatar Jan 31 '24 07:01 g1ra

How can I run the Ollama server? I'm encountering this error in my Django app: [WinError 10061] No connection could be made because the target machine actively refused it. Thank you so much for your help

diegodmb avatar Feb 03 '24 19:02 diegodmb

I solved it by running the Docker container. But it works really, really slowly, even with a light model like Phi.
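For reference, the Docker setup is roughly the following (sketched from Ollama's published Docker instructions; the plain invocation is CPU-only, which may explain the slowness — GPU passthrough needs the NVIDIA Container Toolkit and a `--gpus=all` flag):

```shell
# Start the Ollama server container, persisting downloaded models in a volume
# and exposing the default API port 11434 on the host.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Run a model inside the container (pulls it on first use).
docker exec -it ollama ollama run phi
```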

diegodmb avatar Feb 07 '24 21:02 diegodmb

You need to have a local ollama server running to be able to continue. To do this:

  • Download: https://ollama.com/
  • Run an LLM: https://ollama.com/library
    • Example: ollama run llama2
    • Example: ollama run llama2:70b
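Concretely, once the CLI is installed, the steps above amount to something like this (the desktop app starts the server automatically; otherwise `ollama serve` runs it in the foreground — port 11434 is the default):

```shell
# Start the server manually if the desktop app isn't already running it.
ollama serve

# In another terminal: pull and chat with a model
# (this also confirms the server is reachable).
ollama run llama2

# Quick health check: the root endpoint replies "Ollama is running".
curl http://localhost:11434
```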

There is a PR to add this to the docs (mentioned above).

connor-makowski avatar Feb 15 '24 20:02 connor-makowski

I had to update the README while working on iterative chats (chats with history), so I ended up folding those changes into the PR here: https://github.com/ollama/ollama-python/pull/64

connor-makowski avatar Feb 21 '24 11:02 connor-makowski