Platform Extension MOIM (Minus One Info Message)
Implements ephemeral context injection at the agent boundary to automatically include timestamp and TODO content in LLM context without modifying conversation history.
Problem
Previously, accessing TODO content required explicit `todo__read` tool calls, cluttering conversation history and consuming tokens. Timestamps were injected via template variables in system prompts, creating inconsistency across different prompt variants and preventing effective prompt caching.
Solution
MOIM (Minus One Info Message) injects contextual information directly into the recent context during agent message preparation. This injection contains:
- Current timestamp (always)
- TODO content from session metadata (when available)
- Any other extension-provided context via the `get_moim()` trait method (sketched below)
The injection happens in agent.rs at the message preparation stage, ensuring the context is included in the LLM call but never persisted to storage or returned in conversation history.
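A minimal sketch of what the opt-in surface could look like, using illustrative trait and type names rather than goose's actual API: an extension contributes MOIM text by returning `Some(...)` from `get_moim()` and stays out of MOIM otherwise.

```rust
// Illustrative only; the real goose extension trait differs.
trait ExtensionMoim {
    /// Ephemeral context to include in the next LLM call, if any.
    fn get_moim(&self) -> Option<String> {
        None // default: no MOIM contribution
    }
}

struct TodoExtension {
    todo_content: Option<String>,
}

impl ExtensionMoim for TodoExtension {
    fn get_moim(&self) -> Option<String> {
        // Surface the session's TODO content when it exists.
        self.todo_content
            .as_ref()
            .map(|todo| format!("Current TODO list:\n{todo}"))
    }
}
```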
Architecture Benefits
- Zero changes to core data structures (Message, Conversation)
- No persistence layer modifications
- Clean extension boundaries via the `get_moim()` trait method
- No impact on existing sessions or conversation replay
- Extensions can opt in to MOIM by implementing the trait method
Implementation Details
- `ExtensionManager::collect_moim()` aggregates MOIM content from all active extensions
- `moim::inject_moim()` injects the collected MOIM into the context (both sketched below)
- The TODO extension implements `get_moim()` to provide task content
- Timestamp injection moved from template variables to the MOIM system
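A sketch of how these two pieces might fit together, continuing the illustrative names from the earlier sketch; the real `ExtensionManager` and message types in goose differ.

```rust
// Hypothetical signatures: aggregate the timestamp plus each extension's
// contribution, then append the result to an outgoing copy of the
// conversation so it reaches the provider but is never persisted.
use chrono::Utc;

struct OutgoingMessage {
    role: String,
    content: String,
}

fn collect_moim(extensions: &[Box<dyn ExtensionMoim>]) -> String {
    let mut parts = vec![format!("Current time: {}", Utc::now().to_rfc3339())];
    parts.extend(extensions.iter().filter_map(|ext| ext.get_moim()));
    parts.join("\n\n")
}

fn inject_moim(mut outgoing: Vec<OutgoingMessage>, moim: String) -> Vec<OutgoingMessage> {
    // The real positioning logic in agent.rs is more nuanced (see the review
    // discussion below); the key property is that `outgoing` is a throwaway
    // copy built for this provider call, so storage never sees this message.
    outgoing.push(OutgoingMessage { role: "user".into(), content: moim });
    outgoing
}
```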
Testing
Comprehensive test coverage in crates/goose/tests/moim_tests.rs with proper test isolation using serial_test for tests that modify environment variables.
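For illustration, a hypothetical test skeleton showing the serial_test pattern: `#[serial]` prevents two tests that mutate process-wide environment variables from racing each other. The environment variable name below is invented.

```rust
use serial_test::serial;

#[test]
#[serial] // env vars are process-global, so these tests must not run in parallel
fn moim_injection_respects_env_override() {
    std::env::set_var("GOOSE_EXAMPLE_MOIM_FLAG", "1"); // hypothetical variable
    // ... build a conversation, run the MOIM injection path, assert on the result ...
    std::env::remove_var("GOOSE_EXAMPLE_MOIM_FLAG");
}
```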
Migration
- Removed the `todo__read` tool (breaking change for recipes using it)
- Updated the `todo__write` description to reflect automatic availability
- System prompts now receive the timestamp via MOIM instead of template variables
- Removed the `{{current_date_time}}` template variable from system prompts
User Impact
- TODO content is now automatically available in context after writing
- No need to explicitly read TODO content, reducing token usage
- Cleaner conversation history without repetitive read operations
- Consistent timestamp availability across all prompt variants
I think you shouldn't bother with finding an insertion point. You are doing this just before sending it to the provider, which means that if you make a mistake we'll get a catastrophic error. I think you want to prefix this to the user message just after the `if let Some(session_config) = &session {` block. That way it doesn't go to storage, but you also don't change the structure of the conversation. You probably don't need a moim.rs then either.
Unfortunately, counting on the user message would make this useless for recipes. Long-context independent work is where Goose needs this feature the most.
If this is adjusting the system message each time, doesn't that mean there is no prompt caching (since the content diverges right at the start)? Or do I misunderstand?
Instead of doing that, it adds a synthetic message with the changeable info towards the bottom of the conversation.
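To make the caching point concrete, here is a toy sketch (not goose code) of why the synthetic-message placement preserves a cacheable prefix: the system prompt and earlier turns stay byte-identical between calls, and only the final injected message varies. Putting a timestamp in the system prompt instead would make the payload diverge at token zero.

```rust
// Toy illustration: build the outgoing payload with a stable prefix and a
// late, changing tail.
fn build_outgoing(system_prompt: &str, history: &[String], moim: &str) -> Vec<String> {
    let mut payload = Vec::with_capacity(history.len() + 2);
    payload.push(system_prompt.to_string());               // identical call-to-call
    payload.extend(history.iter().cloned());               // stable conversation prefix
    payload.push(format!("[ephemeral context]\n{moim}"));  // only the tail varies
    payload
}
```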
PR Preview Action v1.6.0
View preview at https://block.github.io/goose/pr-preview/pr-5027/
Built to branch gh-pages at 2025-10-08 13:45 UTC.
Preview will be ready when the GitHub Pages deployment is complete.
@DOsinga can you take a look again? I think I've addressed your changes.
Though, through benchmarking, I found I had to make the message positioning slightly more complicated again to get the best results. But only a little.
Hmm, wondering why my team got tagged in this. I don't see any docs changes.
My bad! I didn't merge main in correctly earlier, and a change in the docs auto-tagged your team.