feat: ai plugin refactor
PR Checklist
Please check if your PR fulfills the following requirements:
- [x] The commit message follows our Commit Message Guidelines
- [ ] Tests for the changes have been added (for bug fixes / features)
- [ ] Docs have been added / updated (for bug fixes / features)
- [ ] Built and fully self-validated in our own designer
PR Type
What kind of change does this PR introduce?
- [ ] Bugfix
- [x] Feature
- [ ] Code style update (formatting, local variables)
- [ ] Refactoring (no functional changes, no api changes)
- [ ] Build related changes
- [ ] CI related changes
- [ ] Documentation content changes
- [ ] Other... Please describe:
Background and solution
What is the current behavior?
Issue Number: N/A
What is the new behavior?
Does this PR introduce a breaking change?
- [ ] Yes
- [x] No
Other information
Summary by CodeRabbit
- New Features
  - Agent mode with streaming tool-calls; a richer AI chat UI (history, sessions, file/image upload, retry/abort); new chat components and renderers for reasoning and images.
  - Persisted model/settings UI and a configurable OpenAI-compatible provider for chat and streaming.
- Bug Fixes
  - Authorization header is now always a string; canceled requests no longer surface errors; safer icon selection when an icon is missing.
- Refactor
  - Centralized model/config composables and modernized chat internals for clearer state and extensibility.
Walkthrough
Restructures the robot plugin into a composable dual-mode AI system (Agent & Chat), adds streaming/chat/tool-call pipelines and an OpenAI-compatible provider, introduces model/service configuration, JSON-Patch schema streaming/repair, new UI components, and removes legacy JS utilities and prompts.
Changes
| Cohort / File(s) | Summary |
|---|---|
| **Core Plugin & Meta**<br>`packages/plugins/robot/index.ts`, `packages/plugins/robot/meta.js`, `packages/plugins/robot/src/metas/index.ts` | Updated imports/exports and RobotService wiring; added meta options (customCompatibleAIModels, enableResourceContext, enableRagContext, encryptServiceApiKey, modeImplementation); RobotService.apis now uses useConfig(). |
| **Removed Legacy Modules**<br>`packages/plugins/robot/src/js/useRobot.ts`, `packages/plugins/robot/src/js/prompts.ts`, `packages/plugins/robot/src/js/utils.ts`, `packages/plugins/robot/src/mcp/utils.ts`, `packages/plugins/robot/src/BuildLoadingRenderer.vue` | Deleted legacy Options-API state, hardcoded PROMPTS, old SSE/chat helpers, MCP utils, and a deprecated loading component. |
| **Core Composables & Adapters**<br>`packages/plugins/robot/src/composables/core/useConfig.ts`, `.../useConversation.ts`, `.../useMessageStream.ts`, `.../pageUpdater.ts` | Added configuration management, a conversation adapter, a streaming message handler with delta processors, and a throttled page schema updater for Agent mode. |
| **Mode System**<br>`packages/plugins/robot/src/composables/modes/useMode.ts`, `.../useAgentMode.ts`, `.../useChatMode.ts` | Introduced a mode registry and runtime dispatcher; implemented Agent and Chat mode hooks and lifecycles. |
| **Chat Workflow**<br>`packages/plugins/robot/src/composables/useChat.ts` | Implemented the streaming chat lifecycle, per-request abort handling, conversation management, tool-call orchestration hooks, and the public chat API (send, abort, create/switch conversation). |
| **Tooling & MCP**<br>`packages/plugins/robot/src/composables/features/useMcp.ts`, `.../useToolCalls.ts` | Reworked MCP integration (removed server connect stubs), changed tool listing return types, and added sequential tool-call orchestration with hooks and abort support. |
| **Provider & Services**<br>`packages/plugins/robot/src/services/OpenAICompatibleProvider.ts`, `.../aiClient.ts`, `.../agentServices.ts`, `.../api.ts` | Added an OpenAI-compatible provider (fetch/axios + streaming), aiClient wiring, agent search/asset helpers, and a consolidated apiService (chat, agent, resources, encrypt). |
| **UI — Chat & Renderers**<br>`packages/plugins/robot/src/components/chat/RobotChat.vue`, `.../FooterButton.vue`, `.../renderers/*.vue`, `.../renderers/index.ts` | New RobotChat UI, FooterButton, and renderer components (Loading, Img, Agent, Markdown) with centralized exports. |
| **UI — Header & Footer Extensions**<br>`packages/plugins/robot/src/components/header-extension/*`, `.../footer-extension/*` | Added History, RobotSetting, and ServiceEditDialog; refactored McpServer to use FooterButton; RobotTypeSelect is now driven by a chatMode prop and emits typeChange; added re-export indexes. |
| **Icons & Indexes**<br>`packages/plugins/robot/src/components/icons/*`, `.../icons/index.ts`, `.../renderers/index.ts` | Removed script exports from some SVGs; added centralized icon and renderer re-exports. |
| **Main Component Migration**<br>`packages/plugins/robot/src/Main.vue` | Migrated to script-setup composition API and integrated the new composables, mode-driven rendering, settings, history, and file upload flows. |
| **Types & Constants**<br>`packages/plugins/robot/src/types/*.ts`, `packages/plugins/robot/src/constants/*` | Added types for modes, settings, MCP, and chat; a model-config catalog; prompt builders; re-exported constants. |
| **Utilities & Schema Helpers**<br>`packages/plugins/robot/src/utils/chat.utils.ts`, `schema.utils.ts`, `meta.utils.ts`, `utils/index.ts` | New message formatting, SSE parsing, error serialization, schema auto-fix, JSON-Patch validation/repair, and a meta option getter; expanded utils barrel exports. |
| **Schema/Prompt Templates**<br>`packages/plugins/robot/src/constants/prompts/templates/agent-prompt.md`, `.../index.ts`, `.../data/examples.json` | Added a strict agent prompt template, dynamic prompt builders, and example payloads consumed by Agent mode. |
| **Layout & Canvas Adjustments**<br>`packages/layout/src/DesignSettings.vue`, `packages/layout/src/Main.vue`, `packages/layout/src/composable/useLayout.ts`, `packages/canvas/container/src/components/CanvasResize.vue` | Removed the top offset on the right panel, set the right-wrap position to relative, and added toolbars.render state with a watcher to recalculate the canvas scale on toolbar render changes. |
| **Misc. Integrations & Fixes**<br>`packages/common/js/completion.js`, `packages/register/src/constants.ts`, `packages/plugins/resource/src/ResourceList.vue` | Consolidated meta-register imports, added META_SERVICE.Robot, and included the description in the resource batch payload. |
| **Package & Config**<br>`packages/plugins/robot/package.json`, `tsconfig.app.json` | Bumped several robot packages from RC to stable, added @vueuse/core, and reordered dependencies; tsconfig: "jsx": "preserve" and updated lib targets. |
| **Demo & Docs**<br>`designer-demo/*`, `docs/*` | Suppressed canceled-request errors in the demo http helper, renamed demo snippets, moved the useRobot implementation path, and replaced the docs with a dual-mode usage guide. |
| **Small Component Fixes**<br>`packages/configurator/src/select-icon-configurator/SelectIconConfigurator.vue` | Added optional chaining when resolving SVG icon creators to prevent runtime errors. |
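The SSE parsing mentioned under Utilities & Schema Helpers can be pictured with a minimal sketch. The function and field names below are illustrative, not the actual `chat.utils.ts` API:

```typescript
// Minimal SSE chunk parser sketch: splits a raw text chunk into
// `data:` payloads and extracts the streamed delta content.
// All names here are hypothetical, not the plugin's real API.
interface SseDelta {
  content?: string;
  done: boolean;
}

function parseSseChunk(raw: string): SseDelta[] {
  const deltas: SseDelta[] = [];
  for (const line of raw.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue; // skip blank lines and SSE comments
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") {
      deltas.push({ done: true });
      continue;
    }
    try {
      const json = JSON.parse(payload);
      deltas.push({ content: json.choices?.[0]?.delta?.content, done: false });
    } catch {
      // A payload split across chunks would be buffered in a real implementation.
    }
  }
  return deltas;
}
```

A real stream handler would additionally buffer partial lines between network chunks; this sketch assumes each call receives whole `data:` lines.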
Sequence Diagram(s)
```mermaid
sequenceDiagram
    participant User
    participant UI as Main / RobotChat
    participant ChatFlow as useChat
    participant ModeMgr as useMode
    participant ModeImpl as Agent/Chat Mode
    participant API as apiService / OpenAICompatibleProvider
    participant MCP as MCP / useMcp
    User->>UI: Submit message / upload image / click prompt
    UI->>ChatFlow: sendUserMessage(payload)
    ChatFlow->>ModeMgr: getCurrentMode()
    ModeMgr-->>ChatFlow: ModeHooks instance
    ChatFlow->>ModeImpl: onBeforeRequest(requestParams)
    ModeImpl->>API: chatStream(requestData)
    API-->>ModeImpl: stream chunks (content, tool_calls)
    ModeImpl->>ChatFlow: onStreamData(chunk)
    ChatFlow->>UI: update renderContent / messages
    alt tool_calls present
        ModeImpl->>MCP: callTools(tool_calls)
        MCP-->>ModeImpl: tool results
        ModeImpl->>API: stream tool-derived content (chatStream)
        API-->>ModeImpl: tool stream chunks
        ModeImpl->>ChatFlow: onStreamTools / onPostCallTools
        ChatFlow->>UI: render tool results
    end
    ModeImpl->>ChatFlow: onRequestEnd(finishReason)
    ChatFlow->>UI: finalize message state
```
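The hook sequence above can be sketched as a small contract plus a driver loop. The interface and function names mirror the diagram but are assumptions about the actual `useMode` API, not its real signatures:

```typescript
// Hypothetical mode-hook contract mirroring the sequence diagram above.
interface ModeHooks {
  onBeforeRequest(params: { messages: string[] }): void;
  onStreamData(chunk: string): void;
  onRequestEnd(finishReason: string): void;
}

// A tiny dispatcher that drives a mode implementation through a fake
// stream, accumulating the rendered content the way the UI layer would.
function runStream(mode: ModeHooks, chunks: string[]): string {
  let rendered = "";
  mode.onBeforeRequest({ messages: [] });
  for (const chunk of chunks) {
    mode.onStreamData(chunk);
    rendered += chunk; // the real UI updates renderContent here
  }
  mode.onRequestEnd("stop");
  return rendered;
}
```

The real dispatcher also handles tool-call branches and abort signals; this sketch only shows the happy-path hook ordering.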
Estimated code review effort
🎯 5 (Critical) | ⏱️ ~120 minutes
- Areas requiring extra attention:
- useChat.ts and conversation adapter: streaming delta handling, abortControllerMap concurrency, state race conditions.
- useConfig.ts: migration, persistence, encryptServiceApiKey behavior, and model/service merging correctness.
- useAgentMode.ts and pageUpdater: JSON-Patch parsing/repair, apply/rollback semantics, and throttled updates.
- OpenAICompatibleProvider & aiClient: SSE vs axios streaming paths, error mapping, and headers/auth handling.
- useToolCalls.ts and useMcp.ts: sequential tool-call orchestration, abort semantics, and MCP integration edge cases.
- New types & re-exports: check for circular imports and consistent type shapes across modules.
- Main.vue and new components: ensure public props/events compatibility and template-driven behavior.
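The JSON-Patch validation/repair flagged above for `useAgentMode.ts` and `pageUpdater` could, in its simplest form, look like the filter below. This is a sketch of the general technique, not the plugin's actual repair logic, and all names are hypothetical:

```typescript
// Hypothetical JSON-Patch sanity filter: keeps only operations that are
// structurally valid per RFC 6902 before they are applied to the page schema.
interface PatchOp {
  op: string;
  path: string;
  value?: unknown;
  from?: string;
}

const OPS_WITH_VALUE = new Set(["add", "replace", "test"]);
const OPS_WITH_FROM = new Set(["move", "copy"]);

function sanitizePatch(ops: PatchOp[]): PatchOp[] {
  return ops.filter((op) => {
    // Paths must be JSON Pointers; this sketch requires a leading "/".
    if (typeof op.path !== "string" || !op.path.startsWith("/")) return false;
    if (OPS_WITH_VALUE.has(op.op)) return "value" in op;
    if (OPS_WITH_FROM.has(op.op)) return typeof op.from === "string";
    return op.op === "remove";
  });
}
```

Repairing truncated patches streamed from a model (e.g. a cut-off JSON array) is a separate, harder problem; this filter only drops structurally invalid operations.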
Poem
🐰
I nibble code and stitch the streams,
Two modes awaken from my dreams,
Patches hop to paint the page,
Tools converse upon the stage,
A rabbit smiles — the UI beams!
Pre-merge checks and finishing touches
❌ Failed checks (1 warning)
| Check name | Status | Explanation | Resolution |
|---|---|---|---|
| Docstring Coverage | ⚠️ Warning | Docstring coverage is 75.00%, which is below the required threshold of 80.00%. | You can run `@coderabbitai generate docstrings` to improve docstring coverage. |
✅ Passed checks (2 passed)
| Check name | Status | Explanation |
|---|---|---|
| Description Check | ✅ Passed | Check skipped - CodeRabbit’s high-level summary is enabled. |
| Title check | ✅ Passed | The title 'feat: ai plugin refactor' is directly related to the changeset, which involves a comprehensive refactor of the AI plugin with agent-mode improvements, new features, UX/UI updates, and bug fixes. |
✨ Finishing touches
- [ ] 📝 Generate docstrings
🧪 Generate unit tests (beta)
- [ ] Create PR with unit tests
- [ ] Post copyable unit tests in a comment
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Comment @coderabbitai help to get the list of available commands and usage tips.
The model configuration should support extra request parameters. For example, when configuring a qwen-plus-thinking model, it should be possible to attach an extra `{ enable_thinking: true }` parameter.
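A minimal sketch of what such per-model extra parameters could look like. The `extraParams` field and the helper function are hypothetical illustrations, not the shipped config shape:

```typescript
// Hypothetical model entry carrying provider-specific extra parameters
// that get merged into every chat-completion request body.
interface ModelConfig {
  id: string;
  extraParams?: Record<string, unknown>;
}

function buildRequestBody(
  model: ModelConfig,
  messages: object[]
): Record<string, unknown> {
  return {
    model: model.id,
    messages,
    ...model.extraParams, // e.g. { enable_thinking: true } for qwen-plus-thinking
  };
}
```

Spreading `extraParams` last lets a model entry override or extend any default field without the chat layer knowing about provider-specific flags.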
The code structure could be strengthened to improve the plugin's extensibility and maintainability, for example:

- Enhance `AIModelOptions` in `useRobot.ts`: provide default configurations, and allow adding or hiding entries at the AI plugin configuration level.
  Benefits: the static configuration of model providers and models is centralized, which makes it easier to read and maintain, and also makes it straightforward to add or hide models from one place later.
  Example of the enhanced structure:

  ```ts
  export default {
    name: 'DeepSeek',
    apiBase: 'https://api.deepseek.com/v1',
    models: [
      {
        id: 'deepseek-chat',
        name: 'deepseek-chat',
        contextWindow: 65536, // context window size
        maxTokens: 8192,
        defaultMaxTokens: 8000,
        inputPrice: 0.0006, // input token price
        outputPrice: 0.002, // output token price
        isDefault: true,
        description: `60 tokens/second, Enhanced capabilities, API compatibility intact`,
        capabilities: { // model capabilities
          tools: {
            enabled: true,
          },
        },
      },
    ]
  }
  ```

- Enhance the modelProvider capabilities (`clients/index.ts`, `clients/OpenAICompatibleProvider.ts`). We currently ship a basic OpenAI-compatible modelProvider. Suggestions:
  - Integrate tool_call handling into the provider (it currently lives in `useChat.ts`).
  - Extract the agent-mode/chat-mode handling out of the provider.
  Benefits: different model providers, and even different models, can differ subtly in tool_call format and request parameters. By keeping all the common handling cohesive inside one provider, we can later expose a configuration point that lets downstream users pass in their own provider to absorb those tool_call and parameter differences.
  Summary: better extensibility plus high cohesion.
- Add mode processors (agent mode, chat mode, plan mode, etc.). Different modes may use different system prompts, callable tools, and callable models. Some of this handling is currently scattered across `useChat` and the modelProvider; consider extracting a processor such as `chatMode` that cohesively handles per-mode differences and then hands off to the modelProvider, keeping the lower layers unaware of the mode.
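The provider-cohesion idea above could be sketched as a provider interface plus a default accumulator for OpenAI-style streamed tool calls. All type and function names here are illustrative, not the proposed API:

```typescript
// Hypothetical fragments of a streamed tool call: OpenAI-style providers
// emit the id/name once and the JSON arguments in pieces, keyed by index.
interface ToolCallDelta {
  index: number;
  id?: string;
  name?: string;
  argumentsChunk?: string;
}

interface ToolCall {
  id: string;
  name: string;
  arguments: string; // complete JSON string once the stream finishes
}

// Default accumulator a provider could own, so useChat never sees
// provider-specific delta formats: fragments are concatenated per index.
function accumulateToolCalls(deltas: ToolCallDelta[]): ToolCall[] {
  const byIndex = new Map<number, ToolCall>();
  for (const d of deltas) {
    const call = byIndex.get(d.index) ?? { id: "", name: "", arguments: "" };
    if (d.id) call.id = d.id;
    if (d.name) call.name = d.name;
    if (d.argumentsChunk) call.arguments += d.argumentsChunk;
    byIndex.set(d.index, call);
  }
  return [...byIndex.values()];
}
```

A custom provider for a vendor with a different delta shape would only need to override this accumulation step, leaving `useChat` untouched.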
Supported.
> The model configuration should support extra request parameters, e.g. a qwen-plus-thinking model configured with an extra `{ enable_thinking: true }` parameter.
Supported.