feat(ci): add llm issue triage workflow
## Summary
- add configurable LLM issue triage workflow with dual triggers, org check and inputs
- implement provider-agnostic triage script with labeling, tracking, and bulk processing
- include repo-wide config file for labels, rate limits, and skip rules
## Testing
- `node --check` on the scripts, `npm install` (see logs)
- dry run on issue #38751 via the workflow script
> [!WARNING]
> Tests have not run on the HEAD 9b6328bd243fe8f2234c6987bc656a1c397dfa8a yet.
Wed, 26 Nov 2025 08:04:29 UTC
Context: https://www.notion.so/appsmith/V1-Community-Auto-Maintenance-AI-Assisted-Contribution-Workflow-2b7fe271b0e2801f8915dd2a334ceaf5?source=copy_link
## Summary by CodeRabbit
- **New Features**
  - Added an AI-powered GitHub issue triage system with automatic complexity analysis and labeling
  - Support for multiple LLM providers (OpenAI, Gemini, Anthropic)
  - Configurable triage workflow with manual and automatic triggering options
  - Automated issue filtering, analysis, and comment generation based on AI insights
  - Rate limiting and permission-based controls for organized triage operations
## Walkthrough
Introduces an LLM-based GitHub issue triage system via configuration, a GitHub Actions workflow, orchestration layer, analysis module, GitHub labeler, and three pluggable LLM provider implementations (OpenAI, Gemini, Anthropic) for automated issue classification and labeling.
## Changes

| Cohort / File(s) | Summary |
|---|---|
| **Configuration & Package Setup**<br>`.github/issue-triage-config.yml`, `.github/workflows/scripts/llm-triage/package.json` | Defines triage rules, label mappings, complexity levels, skip/target logic, and rate limiting; establishes Node.js package metadata with LLM SDK dependencies (openai, gemini, anthropic, octokit). |
| **Workflow & Orchestration**<br>`.github/workflows/issue-triage.yml`, `.github/workflows/scripts/llm-triage/index.js` | GitHub Actions workflow handling authorization checks, environment setup, and invocation; main entry point dispatches to single or bulk issue processing with workflow_dispatch and issue event support. |
| **Core Analysis & Labeling**<br>`.github/workflows/scripts/llm-triage/analyzer.js`, `.github/workflows/scripts/llm-triage/labeler.js`, `.github/workflows/scripts/llm-triage/config.js` | IssueAnalyzer orchestrates LLM-driven issue analysis with codebase context and prompt building; GitHubLabeler applies triage results, posts comments, and manages labels; config module loads YAML rules and merges environment overrides. |
| **LLM Provider Abstraction**<br>`.github/workflows/scripts/llm-triage/providers/base.js`, `.github/workflows/scripts/llm-triage/providers/openai.js`, `.github/workflows/scripts/llm-triage/providers/gemini.js`, `.github/workflows/scripts/llm-triage/providers/anthropic.js`, `.github/workflows/scripts/llm-triage/providers/index.js` | BaseLLMProvider defines the contract for response parsing and validation; three concrete providers implement LLM integrations (gpt-4o, gemini-1.5-pro, claude-sonnet-4); a factory function enables provider instantiation by name. |
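The "loads YAML rules and merges environment overrides" behavior described for the config module could look roughly like the sketch below. This is a hedged illustration only: `deepMerge`, `loadConfig`, and the `TRIAGE_*` variable names are invented here and are not the repository's actual identifiers.

```javascript
// Illustrative deep merge of file-based defaults with environment
// overrides; all names are hypothetical, not the actual config.js API.
function deepMerge(base, override) {
  const out = { ...base };
  for (const [key, value] of Object.entries(override)) {
    const bothObjects =
      value && typeof value === "object" && !Array.isArray(value) &&
      base[key] && typeof base[key] === "object" && !Array.isArray(base[key]);
    if (bothObjects) {
      out[key] = deepMerge(base[key], value); // recurse into nested sections
    } else if (value !== undefined) {
      out[key] = value; // scalar/array override wins
    }
  }
  return out;
}

// Environment variables override settings loaded from the YAML file.
function loadConfig(fileConfig, env = process.env) {
  const overrides = {};
  if (env.TRIAGE_PROVIDER) overrides.provider = env.TRIAGE_PROVIDER;
  if (env.TRIAGE_RATE_LIMIT) {
    overrides.rateLimit = { maxPerRun: Number(env.TRIAGE_RATE_LIMIT) };
  }
  return deepMerge(fileConfig, overrides);
}
```

The point of the nested merge is that a single environment variable can override one field (here `maxPerRun`) without discarding sibling settings from the YAML file.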
## Sequence Diagrams

```mermaid
sequenceDiagram
    participant GitHub as GitHub Actions
    participant Workflow as Workflow (issue-triage.yml)
    participant Index as index.js
    participant Config as config.js
    participant Analyzer as IssueAnalyzer
    participant Provider as LLM Provider
    participant Labeler as GitHubLabeler
    participant GitHubAPI as GitHub API
    GitHub->>Workflow: Trigger (dispatch/issue event)
    Workflow->>Workflow: Check authorization
    Workflow->>Index: Execute with env vars
    Index->>Config: Load config & API keys
    Index->>Analyzer: Create instance
    Index->>Labeler: Create instance
    alt Workflow Dispatch - Single Issue
        Index->>Index: Fetch issue
    else Workflow Dispatch - Bulk
        Index->>GitHubAPI: Query issues by labels
    else Issue Event
        Index->>Index: Use event issue
    end
    Index->>Analyzer: analyzeIssue(issue, context)
    Analyzer->>Analyzer: Build prompt & codebase context
    Analyzer->>Provider: analyze(prompt, context)
    Provider->>Provider: Call LLM API
    Provider->>Analyzer: Return TriageResult
    Analyzer->>Index: Return parsed result
    Index->>Labeler: applyTriageResult(issue, result)
    Labeler->>GitHubAPI: Post comment
    Labeler->>GitHubAPI: Add labels
    Labeler->>Index: Confirm applied
    Index->>Workflow: Summary & logging
    Workflow->>GitHub: Complete
```
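The three dispatch branches shown above (single issue, bulk, issue event) might reduce to a small mode-resolution helper in `index.js`. The sketch below is a hypothetical reconstruction; `resolveMode` and its field names are invented for illustration and do not appear in the PR.

```javascript
// Hypothetical sketch of the entry point's dispatch branching, mirroring
// the alt/else branches in the sequence diagram; names are illustrative.
function resolveMode({ eventName, inputs = {}, eventIssue = null }) {
  if (eventName === "workflow_dispatch") {
    // An explicit issue number means single-issue mode; otherwise bulk.
    return inputs.issue_number
      ? { mode: "single", issueNumber: Number(inputs.issue_number) }
      : { mode: "bulk", labels: inputs.labels || [] };
  }
  if (eventName === "issues" && eventIssue) {
    // Automatic trigger: triage the issue carried by the event payload.
    return { mode: "event", issueNumber: eventIssue.number };
  }
  throw new Error(`Unsupported trigger: ${eventName}`);
}
```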
```mermaid
sequenceDiagram
    participant Factory as createProvider()
    participant ApiKeys as apiKeys Object
    Factory->>ApiKeys: Lookup key for provider name
    alt Key exists
        Factory->>Factory: Instantiate provider
        Factory->>Factory: Return provider instance
    else Key missing
        Factory->>Factory: Throw error (missing API key)
    else Unknown provider
        Factory->>Factory: Throw error (unsupported)
    end
```
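The factory flow above could be sketched as follows. This is a minimal illustration of the described contract, not the repository's code: the provider class bodies are stubs, and only `createProvider` matches a name mentioned in the diagram.

```javascript
// Illustrative provider factory matching the diagram: look up the class
// by name, require an API key, and fail loudly otherwise. Class bodies
// are stubs; the real providers wrap their respective SDKs.
class OpenAIProvider { constructor(apiKey) { this.name = "openai"; this.apiKey = apiKey; } }
class GeminiProvider { constructor(apiKey) { this.name = "gemini"; this.apiKey = apiKey; } }
class AnthropicProvider { constructor(apiKey) { this.name = "anthropic"; this.apiKey = apiKey; } }

const PROVIDERS = {
  openai: OpenAIProvider,
  gemini: GeminiProvider,
  anthropic: AnthropicProvider,
};

function createProvider(name, apiKeys) {
  const Provider = PROVIDERS[name];
  if (!Provider) throw new Error(`Unsupported provider: ${name}`);
  const key = apiKeys[name];
  if (!key) throw new Error(`Missing API key for provider: ${name}`);
  return new Provider(key);
}
```

Failing fast on a missing key keeps misconfiguration errors at startup rather than mid-run, after some issues have already been labeled.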
## Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes
- Async coordination: Multiple async flows (workflow events, GitHub API calls, LLM provider calls) with error handling across layers
- Provider pattern implementation: Factory function and three distinct provider integrations require validation of consistent interface adherence and response parsing
- GitHub API interactions: Label application, comment posting, issue querying with conditional logic and error recovery
- Configuration merging & environment overrides: Deep merge logic and normalization of skip/target labels
- Control flow branching: Distinct paths for workflow_dispatch (single/bulk), issue events, and re-analysis logic
Areas requiring extra attention:
- Error handling consistency across all three LLM providers and API calls
- Response parsing and validation in `parseResponse`/`validateTriageResult` across providers
- Label filtering and application logic in `GitHubLabeler.applyTriageResult` (conditional complexity label, valid label filtering)
- Rate limiting and delay implementation in bulk processing
- Codebase context extraction accuracy and keyword-to-file mappings
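The rate-limiting and delay concern flagged above usually comes down to capping the batch size and sleeping between calls. A minimal sketch, assuming invented names (`processBulk`, `maxPerRun`, `delayMs`) rather than the PR's actual implementation:

```javascript
// Hypothetical bulk-processing loop with a per-run cap and a fixed
// inter-issue delay to stay under LLM provider rate limits.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function processBulk(issues, handleIssue, { maxPerRun = 10, delayMs = 2000 } = {}) {
  const batch = issues.slice(0, maxPerRun); // honor the per-run cap
  const results = [];
  for (let i = 0; i < batch.length; i++) {
    results.push(await handleIssue(batch[i]));
    // Pause between calls, but not after the last one.
    if (i < batch.length - 1) await sleep(delayMs);
  }
  return results;
}
```

Processing sequentially (rather than `Promise.all`) is what makes the delay meaningful; a parallel fan-out would defeat the rate limit.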
## Poem

> 🤖 Issues swirl in the GitHub tide,
> LLM triage takes them in stride,
> OpenAI, Gemini, Claude as guide,
> Labels applied, complexity spied,
> Automation and wisdom collide! ✨
## Pre-merge checks and finishing touches
### ❌ Failed checks (1 inconclusive)

| Check name | Status | Explanation | Resolution |
|---|---|---|---|
| Description check | ❓ Inconclusive | The description includes a summary of changes and testing details but lacks the issue reference (Fixes #) and the DevRel/Marketing checkbox from the template. | Add 'Fixes #' with the related issue number and complete the Communication checklist to fully align with the repository template. |
### ✅ Passed checks (2 passed)

| Check name | Status | Explanation |
|---|---|---|
| Title check | ✅ Passed | The title accurately describes the main change: adding an LLM-based issue triage workflow to CI configuration. |
| Docstring Coverage | ✅ Passed | Docstring coverage is 100.00%, which is sufficient. The required threshold is 80.00%. |
### ✨ Finishing touches
- [ ] 📝 Generate docstrings

### 🧪 Generate unit tests (beta)
- [ ] Create PR with unit tests
- [ ] Post copyable unit tests in a comment
- [ ] Commit unit tests in branch `feature/llm-triage-poc`
> [!TIP]
> Customizable high-level summaries are now available in beta!
> You can now customize how CodeRabbit generates the high-level summary in your pull requests, including its content, structure, tone, and formatting.
> - Provide your own instructions using the `high_level_summary_instructions` setting.
> - Format the summary however you like (bullet lists, tables, multi-section layouts, contributor stats, etc.).
> - Use `high_level_summary_in_walkthrough` to move the summary from the description to the walkthrough section.
>
> Example instruction:
> "Divide the high-level summary into five sections:
> - Description: Summarize the main change in 50–60 words, explaining what was done.
> - References: List relevant issues, discussions, documentation, or related PRs.
> - Dependencies & Requirements: Mention any new/updated dependencies, environment variable changes, or configuration updates.
> - Contributor Summary: Include a Markdown table showing contributions:
>   | Contributor | Lines Added | Lines Removed | Files Changed |
> - Additional Notes: Add any extra reviewer context. Keep each section concise (under 200 words) and use bullet or numbered lists for clarity."
>
> Note: This feature is currently in beta for Pro-tier users, and pricing will be announced later.
This PR has not seen activity for a while. It will be closed in 7 days unless further activity is detected.

This PR has been closed because of inactivity.