
Add initial implementation for the Counsel of 9

simonsickle opened this issue 1 month ago · 2 comments

Summary

Counsel of 9 is a new feature that provides diverse perspectives on user questions by simulating a council of 9 AI personas with distinct personalities and values.

How it works:

  1. User submits a question/prompt
  2. Nine AI personas (Pragmatist, Visionary, Skeptic, Optimist, Analyst, Creative, Ethicist, Realist, Mediator) each provide their opinion
  3. Each persona votes for the best opinion (excluding their own) based on their values
  4. The opinion with the most votes is returned as the 'winning' answer

This lays a foundation for future features such as a multi-LLM planning mode.

NOTE: this can become very expensive if you are using a model like Opus or GPT-5 Pro.
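As a rough sketch of the flow described above (the `ask_persona` and `vote` functions are placeholder stand-ins; the actual PR makes these calls through goose's LLM providers with persona-specific prompts), the opinion-and-voting round might look like:

```python
import random
from collections import Counter

PERSONAS = ["Pragmatist", "Visionary", "Skeptic", "Optimist",
            "Analyst", "Creative", "Ethicist", "Realist", "Mediator"]

def ask_persona(persona: str, question: str) -> str:
    # Placeholder for a fresh LLM conversation with a persona-specific system prompt.
    return f"{persona}'s opinion on: {question}"

def vote(persona: str, opinions: dict[str, str]) -> str:
    # Placeholder voting logic: each persona picks a best opinion,
    # excluding its own. (The real feature votes based on persona values.)
    candidates = [p for p in opinions if p != persona]
    return random.choice(candidates)

def counsel_of_9(question: str) -> tuple[str, str]:
    # 1. Each of the nine personas provides an opinion.
    opinions = {p: ask_persona(p, question) for p in PERSONAS}
    # 2. Each persona votes for the best opinion other than its own.
    votes = Counter(vote(p, opinions) for p in PERSONAS)
    # 3. The most-voted opinion wins.
    winner, _ = votes.most_common(1)[0]
    return winner, opinions[winner]
```

Note that each persona requires its own LLM conversation, so a single question fans out into nine opinion calls plus nine voting calls, which is the source of the token-cost concern above.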

Type of Change

  • [x] Feature
  • [ ] Bug fix
  • [ ] Refactor / Code quality
  • [ ] Performance improvement
  • [ ] Documentation
  • [ ] Tests
  • [ ] Security fix
  • [ ] Build / Release
  • [ ] Other (specify below)

Testing

Manual testing

Related Issues

Relates to #ISSUE_ID
Discussion: LINK (if any)

Screenshots/Demos (for UX changes)

Before:

After:
(Screenshots from 2025-10-31: council output views)

Alternate counsel members

(Screenshots from 2025-10-31: alternate counsel members)


simonsickle avatar Oct 31 '25 18:10 simonsickle

An interesting approach! But it feels tough to me to include in the core goose implementation given how many tokens it could use... What would you think about a version that implements the core approach as a set of MCP tools in an optional server people could turn on?

I did originally want to make this an MCP server with MCP-UI, but I couldn't find a reliable way to make calls to the different LLMs through an MCP... maybe I'm thinking about this architecture incorrectly, though, as it's new to me (happy to chat more on Slack to get this to a better state). Because of the voting and personas, we have to spawn fresh conversations with the LLM providers, not just take the current chat and provide it as a tool call.

Making it an MCP server that could be toggled would be really ideal, because I wanted the ability to ask in chat what the best way to do something would be, or to ask the council about something. The only reason I deviated from this was the above need to call an LLM; goose core has that functionality built in, where MCPs don't.

Because of the token use, I did split it out of chat, which was a trade-off I'd rather not have made.

simonsickle avatar Nov 03 '25 20:11 simonsickle

The only reason I deviated from this was due to the above need of calling a LLM and goose core had that functionality built in where mcps don’t.

@simonsickle I implemented support for MCP sampling last week where the MCP server can use goose's model connection. It should just work if you emit sampling/createMessage requests from the server
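Per the MCP specification, sampling lets a server borrow the client's (here, goose's) model connection by sending a `sampling/createMessage` JSON-RPC request. Assuming that shape, a server-side helper might build the request like this (the persona system prompt is purely illustrative):

```python
def make_sampling_request(prompt: str, request_id: int = 1,
                          max_tokens: int = 512) -> dict:
    # JSON-RPC request an MCP server sends to its client, asking the
    # client's configured LLM to generate a completion on its behalf.
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "sampling/createMessage",
        "params": {
            "messages": [
                {"role": "user", "content": {"type": "text", "text": prompt}}
            ],
            # Hypothetical persona prompt, as the Counsel of 9 would use:
            "systemPrompt": "You are the Skeptic persona.",
            "maxTokens": max_tokens,
        },
    }
```

Since each request carries its own messages and system prompt, the server can open a fresh conversation per persona without touching the user's chat session, which addresses the constraint raised in the earlier comment.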

alexhancock avatar Nov 03 '25 20:11 alexhancock

I think it would indeed be great if we can turn this into an MCP server and use the new sampling! Going to close this for now.

DOsinga avatar Nov 06 '25 04:11 DOsinga