prompt(messages=[...]) to pass a full array of messages
Need this to support implementing a Chat Completions API.
I had Claude Code Interpreter run some experiments around this; details are in this transcript: https://claude.ai/share/9d5b0729-b58d-4b15-9e45-ab1e7152b89e
And this branch commit: https://github.com/simonw/llm/commit/697636222a277f892dfffef86b2471cc89c95576
The most interesting output from that is this document:
- https://github.com/simonw/llm/blob/697636222a277f892dfffef86b2471cc89c95576/message_matching_notes.md
My initial idea for a design looks like this:
```python
response = model.prompt(messages=[
    llm.System("you are a useful assistant"),
    llm.User("Capital of France?"),
    llm.Assistant("Paris"),
    llm.User("Germany?")
])
```
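For illustration, here is one way those message classes could look under the hood. None of this exists in llm yet, so the class shapes and the `role` field are just assumptions about the design, a minimal sketch rather than a proposal for the actual implementation:

```python
# Sketch only: possible shapes for the proposed message classes.
# The names match the design above; the fields are assumptions.
from dataclasses import dataclass


@dataclass
class System:
    content: str
    role: str = "system"


@dataclass
class User:
    content: str
    role: str = "user"


@dataclass
class Assistant:
    content: str
    role: str = "assistant"
```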
I guess you could do this as a shortened version:
```python
from llm import User as U, System as S, Assistant as A

response = model.prompt(messages=[
    S("you are a useful assistant"),
    U("Capital of France?"),
    A("Paris"),
    U("Germany?")
])
```
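To tie this back to the Chat Completions use case: a proxy endpoint could translate the incoming OpenAI-style role/content dicts directly into this messages list. A rough sketch, assuming the `llm.System`/`llm.User`/`llm.Assistant` classes from the design above; the `ROLES` mapping and the helper function are hypothetical names, only the OpenAI-style input shape is a given:

```python
# Hypothetical bridge from a Chat Completions request body to the
# proposed messages=[] API. Nothing here exists in llm yet.
import llm

ROLES = {"system": llm.System, "user": llm.User, "assistant": llm.Assistant}


def messages_from_chat_completions(payload):
    # payload["messages"] is the OpenAI-style [{"role": ..., "content": ...}, ...] list
    return [ROLES[m["role"]](m["content"]) for m in payload["messages"]]


payload = {
    "messages": [
        {"role": "system", "content": "you are a useful assistant"},
        {"role": "user", "content": "Capital of France?"},
    ],
}
response = model.prompt(messages=messages_from_chat_completions(payload))
```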
Possibly a dupe of: https://github.com/simonw/llm/issues/894