ollama-python
Suggestion: Use models to encapsulate requests/responses
Consider using models to properly encapsulate requests/responses.
For example:
import ollama
response = ollama.chat(model='llama2', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
print(response['message']['content'])
would become something like:
import ollama
from ollama.models import ChatMessageRequest, ChatMessageResponse
messages: list[ChatMessageRequest] = [ChatMessageRequest('user', 'Why is the sky blue?')]
response: ChatMessageResponse = ollama.chat(model='llama2', messages=messages)
print(response.content)
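For illustration, here is a minimal sketch of how such models could look using plain dataclasses. ChatMessageRequest, ChatMessageResponse, and their field names are assumptions made for this suggestion, not part of the current library:

# Hypothetical models for this suggestion -- not part of ollama-python today.
from dataclasses import dataclass

@dataclass
class ChatMessageRequest:
    role: str      # e.g. 'user', 'assistant', 'system'
    content: str   # the message text

    def to_dict(self) -> dict:
        # Shape expected by the existing dict-based chat API
        return {'role': self.role, 'content': self.content}

@dataclass
class ChatMessageResponse:
    role: str
    content: str

    @classmethod
    def from_dict(cls, data: dict) -> 'ChatMessageResponse':
        # Build a response model from the dict the current API returns
        message = data['message']
        return cls(role=message['role'], content=message['content'])

The existing dict-based call could then be wrapped so callers only deal with the models, e.g. ChatMessageResponse.from_dict(ollama.chat(model='llama2', messages=[m.to_dict() for m in messages])).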