ollama-python

Persistent chat memory

Open kikoferrer opened this issue 6 months ago • 7 comments

I have ollama-python running with a custom Ollama model. It works very well, except that it does not remember the conversation at all: every chat is like a new conversation.

I checked the existing issues and couldn't find the same problem I'm having (or I didn't look hard enough, forgive me).

I made this basic prototype with a streaming chat to test:

import ollama


model = 'Llama3'

def chat(message):
    # send only the latest user message; no prior turns are included
    messages = [{
        'role': 'user',
        'content': message,
    }]
    response = ollama.chat(model=model, messages=messages, stream=True)
    # print the reply chunk by chunk as it streams in
    for chunk in response:
        print(chunk['message']['content'], end='', flush=True)

while True:
    print('\nQ to quit')
    prompt = input('Enter your message: ')
    if prompt.lower() == 'q':
        break
    else:
        chat(prompt)

Is there a doc somewhere I can use as a guide for more use cases like persistent memory/context using the model's context size? Something similar to how ollama run works, where the chat is continuous. Thanks.
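
For reference, the pattern I assume would give persistent memory (keeping one messages list alive across calls and appending both the user turn and the assistant's streamed reply) would look roughly like this. This is just my sketch, not something I found documented:

import ollama

model = 'Llama3'
history = []  # running conversation: alternating user/assistant messages

def chat(message):
    # add the new user turn to the running history
    history.append({'role': 'user', 'content': message})
    # send the whole history so the model sees prior turns
    response = ollama.chat(model=model, messages=history, stream=True)
    reply = ''
    for chunk in response:
        content = chunk['message']['content']
        print(content, end='', flush=True)
        reply += content
    # store the assistant's reply so the next call has the full context
    history.append({'role': 'assistant', 'content': reply})

while True:
    print('\nQ to quit')
    prompt = input('Enter your message: ')
    if prompt.lower() == 'q':
        break
    chat(prompt)

Is that the intended approach, or is there a built-in way to have the client manage the context?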

kikoferrer · Jul 31 '24 09:07