Connor Makowski

8 comments

Related to: https://github.com/ollama/ollama-python/issues/63 and https://github.com/ollama/ollama-python/pull/64

You need to have a local Ollama server running to be able to continue. To do this:
- Download: https://ollama.com/
- Run an LLM: https://ollama.com/library
- Example: `ollama run llama2` ...
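Once the server is running, you can sanity-check it from Python. This is a small sketch (the helper name is hypothetical) assuming the default local address, http://localhost:11434, which answers a plain GET while the server is up:

```python
import urllib.request
import urllib.error

def ollama_is_up(url="http://localhost:11434", timeout=2):
    """Return True if a local Ollama server answers at `url`.

    A sketch assuming Ollama's default port, 11434; any connection
    error or timeout is treated as "not running".
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

server_up = ollama_is_up()
```

If this returns `False`, start the server (e.g. `ollama run llama2`) before using the client.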

I had to update the README while working on iterative chats (chats with history), so I ended up migrating these changes into this PR: https://github.com/ollama/ollama-python/pull/64

Worth mentioning that there are a few ways to do this. For example, you might also add something like:

```
## Local Ollama Setup
You need to have a local...
```

Referencing https://github.com/ollama/ollama-python/pull/155 as an extension for adding clarity to the getting-started process.

For general use as shown in most examples, you should have a local Ollama server running to be able to continue. To do this:
- Download: https://ollama.com/
- In your...

This verbiage is part of the PR: https://github.com/ollama/ollama-python/pull/64

It is also worth noting that you are using an `await`. Are you using an async client? For a non-async client you do not need `await`:

```python
import ollama...
```
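The sync/async distinction above can be sketched with stand-in functions (hypothetical names, not the real `ollama` API) to show where `await` is and is not needed:

```python
import asyncio

# Stand-in for a synchronous client call (hypothetical; mirrors the
# shape of a chat call): a plain function, so no `await` is needed.
def chat(model, messages):
    return {"message": {"content": f"sync reply from {model}"}}

# Stand-in for an async client call: a coroutine, so it must be
# awaited (here driven to completion by asyncio.run).
async def chat_async(model, messages):
    return {"message": {"content": f"async reply from {model}"}}

sync_result = chat("llama2", [{"role": "user", "content": "hi"}])
async_result = asyncio.run(chat_async("llama2", [{"role": "user", "content": "hi"}]))
```

With the real library the same split applies: a synchronous client call returns its value directly, while an async client call must be awaited inside a coroutine (or driven by an event loop).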