Simon Willison

2,692 comments by Simon Willison

I think this is the migration:

```python
@migration
def m018_replies(db):
    db["responses"].add_column("reply_to_id", str)
    db["responses"].add_foreign_key("reply_to_id", "responses", "id")
    db["responses"].transform(
        column_order=(
            "id",
            "reply_to_id",
            "model",
            "prompt",
            "system",
            "prompt_id",
            "system_id",
            "schema_id",
            "prompt_json",
            "options_json",
            "response",
            "response_json",
            "conversation_id",
            ...
```

I guess this means the `Prompt()` class constructor needs to be able to take a `reply_to_id=` argument, or a `reply_to=Response()` argument, or both? Would be neater to just do `reply_to=response`...
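A minimal sketch of how that constructor option could look - the `reply_to`/`reply_to_id` parameter names and the class body here are hypothetical, not the actual `Prompt` implementation:

```python
class Prompt:
    def __init__(self, prompt, model, *, reply_to=None, reply_to_id=None):
        # Accept either a Response object or a raw ID, but not both
        if reply_to is not None and reply_to_id is not None:
            raise ValueError("Pass reply_to or reply_to_id, not both")
        if reply_to is not None:
            # Pull the ID off the Response so the rest of the code
            # only ever has to deal with reply_to_id
            reply_to_id = reply_to.id
        self.prompt = prompt
        self.model = model
        self.reply_to_id = reply_to_id
```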

Also interesting: currently the `.execute()` method (including in all the plugins) has this signature: https://github.com/simonw/llm/blob/e78e1fceb273aeed467d80ec6c1c710a1433d3c1/llm/default_plugins/openai_models.py#L573-L578 A lot of those then have methods like this one: https://github.com/simonw/llm/blob/e78e1fceb273aeed467d80ec6c1c710a1433d3c1/llm/default_plugins/openai_models.py#L468-L480 Note how `conversation` is...
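For reference, the documented shape of that method (simplified here from the plugin tutorial - the exact parameters in `openai_models.py` differ, see the links above) is roughly:

```python
import llm


class MyModel(llm.Model):
    model_id = "my-model"

    def execute(self, prompt, stream, response, conversation):
        # conversation is the optional Conversation carrying the
        # previous prompt/response pairs for this thread
        yield "output"
```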

This change could be a breaking change for existing plugins. That's worth thinking about - it may be possible to keep them working by detecting if their `execute()` method takes...
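One way that detection could work, sketched with `inspect.signature()` - purely illustrative, and `new_arg` is a stand-in for whatever the new parameter ends up being called:

```python
import inspect


def supports_argument(method, name):
    # True if the plugin's execute() declares a parameter with this
    # name, or accepts arbitrary keyword arguments via **kwargs
    params = inspect.signature(method).parameters
    return name in params or any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
    )


# Only pass the new argument to plugins that know about it:
#
# if supports_argument(model.execute, "new_arg"):
#     model.execute(prompt, stream, response, conversation, new_arg=value)
# else:
#     model.execute(prompt, stream, response, conversation)
```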

The `.execute()` signature is a bit of a mess already, perhaps I rename that method to some new name to allow for a fresh design entirely? Current docs: https://github.com/simonw/llm/blob/e78e1fceb273aeed467d80ec6c1c710a1433d3c1/docs/plugins/tutorial-model-plugin.md#L224 >...

I'm already reconsidering what `.execute()` does a bit for annotations in:

- #716

Now that I've built this:

- https://github.com/simonw/llm-fragments-github/issues/3

I can try this:

```bash
llm -f github:simonw/llm \
  -f issue:simonw/llm/938 \
  -m gemini-2.5-pro-exp-03-25 \
  --system 'muse on this issue, then propose a...
```

I think the first step here is going to be designing and adding a tool definition abstraction to the `Prompt` class. Next step is the same thing but for tool...
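A rough sketch of what such an abstraction might look like - the `ToolDefinition` name and its fields are hypothetical, not a design from the issue:

```python
import inspect
from dataclasses import dataclass, field
from typing import Callable, Optional


@dataclass
class ToolDefinition:
    # Hypothetical shape for a tool attached to a Prompt
    name: str
    description: str = ""
    parameters: dict = field(default_factory=dict)  # JSON schema for arguments
    implementation: Optional[Callable] = None

    @classmethod
    def from_function(cls, fn: Callable) -> "ToolDefinition":
        # Derive the name and description from the function itself
        return cls(
            name=fn.__name__,
            description=inspect.getdoc(fn) or "",
            implementation=fn,
        )
```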

I don't plan to implement MCP directly in LLM core, but I anticipate building a plugin that adds MCP support to LLM and builds on top of the new tools...

That's a really good call. I was going to leave that entirely up to plugins but it would make sense for the core library to include an easy "make this...