Simon Willison
Options for how this could work. I think it's important to be able to return both attachments AND a regular response too.

1. Tools currently return a string or a...
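For illustration, one possible shape for that (a rough sketch, not the final API): a small wrapper object that carries both the regular string output and a list of attachments. The `ToolResult` class and its attribute names here are assumptions made up for the sketch; `llm.Attachment` is the existing attachment class.

```python
from dataclasses import dataclass, field
from typing import List

import llm


@dataclass
class ToolResult:
    # Hypothetical wrapper a tool could return instead of a plain string
    output: str
    attachments: List[llm.Attachment] = field(default_factory=list)


def fetch_image_url(url: str) -> ToolResult:
    # Sketch of a tool returning both a text response and an image attachment
    return ToolResult(
        output=f"Fetched image from {url}",
        attachments=[llm.Attachment(url=url)],
    )
```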
This is going to need a database change, as the database records of tool results need to be able to record when a tool returned one or more attachments.
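A minimal sketch of what that schema change might look like, assuming the existing sqlite-utils backed logs database. The table and column names (`tool_results_attachments` etc.) are guesses for illustration, not the actual migration:

```python
import sqlite_utils

db = sqlite_utils.Database("logs.db")

# Hypothetical many-to-many table linking tool results to attachments
db["tool_results_attachments"].create(
    {
        "tool_result_id": str,
        "attachment_id": str,
    },
    pk=("tool_result_id", "attachment_id"),
    foreign_keys=[
        ("tool_result_id", "tool_results", "id"),
        ("attachment_id", "attachments", "id"),
    ],
    if_not_exists=True,
)
```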
Got as far as this error, got a bit stuck:

```
openai.BadRequestError: Error code: 400 - {'error': {'message': "An assistant message with 'tool_calls' must be followed by tool messages responding...
```
Maybe the problem here is that the user message with the attachment should come _after_ the `"role": "tool"` message, not before.
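To make that ordering concrete, here's roughly what the OpenAI Chat Completions `messages` array needs to look like; the tool call ID and content values are placeholders:

```python
messages = [
    {"role": "user", "content": "Tell me about the image at that URL"},
    # The assistant responds with a tool call
    {
        "role": "assistant",
        "tool_calls": [
            {
                "id": "call_123",
                "type": "function",
                "function": {
                    "name": "fetch_image_url",
                    "arguments": '{"url": "https://static.simonwillison.net/static/2025/two-pelicans.jpg"}',
                },
            }
        ],
    },
    # The "role": "tool" message must directly follow the assistant message with tool_calls...
    {"role": "tool", "tool_call_id": "call_123", "content": "Image fetched"},
    # ...and only then can the attachment be injected as a user message
    {
        "role": "user",
        "content": [
            {
                "type": "image_url",
                "image_url": {
                    "url": "https://static.simonwillison.net/static/2025/two-pelicans.jpg"
                },
            }
        ],
    },
]
```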
OK the branch I just shipped can now do this (with a temporary testing `fetch_image_url` tool):

```bash
llm -T fetch_image_url 'Tell me about https://static.simonwillison.net/static/2025/two-pelicans.jpg' --td
```

Output:

> ```
> ...
Lots still to solve:

- Does it work with async / streaming / non-streaming?
- Are there cases where it breaks the `messages` ordering in some way (I had to...
Here's the current `llm logs -c` output for that:

> # 2025-06-01T00:58:40 conversation: 01jwmfyxgjye8scty1zmpqt3rb id: 01jwmfyxgmcjcye6ad9w856aag
>
> Model: **gpt-4.1-mini**
>
> ## Prompt
>
> Tell me about https://static.simonwillison.net/static/2025/two-pelicans.jpg...
I'll finish this in a PR.
Documentation: https://github.com/simonw/llm/blob/7645ab9a9586ad5c824d7ecaa950bfc8a27de1be/docs/python-api.md#tools-can-return-attachments
Here's a plugin for trying this out: https://github.com/simonw/llm-tools-image-from-url
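For reference, a tool plugin along those lines might look something like this. It's a rough sketch using the `register_tools` plugin hook; the return type used for attaching images (`llm.ToolOutput` here) is an assumption and should be checked against the documentation linked above:

```python
import llm


def fetch_image_url(url: str):
    """Return the image at the given URL as an attachment, plus a short text note."""
    # llm.ToolOutput is an assumption - check the linked docs for the real return type
    return llm.ToolOutput(
        output=f"Attached image from {url}",
        attachments=[llm.Attachment(url=url)],
    )


@llm.hookimpl
def register_tools(register):
    register(fetch_image_url)
```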