
Refactor PowerShell Performance Testing Scripts into Modular Architecture

Opened by coderabbitai[bot]

Problem Statement

The performance testing PowerShell scripts in build/scripts/perf/ have grown organically and now exhibit several architectural issues that impact maintainability:

Current Issues Identified

  1. Code Duplication: The Show-Invocation function is duplicated between DiffPerfToBaseline.ps1 and PerfCore.ps1
  2. No Modular Structure: All functionality is embedded within individual scripts (400 total lines across 4 scripts)
  3. Inconsistent Error Handling: Different scripts use varying approaches to error handling and logging
  4. Path Resolution Duplication: Multiple scripts duplicate the same RepoRoot resolution logic
  5. No Shared Constants: Hard-coded paths and configuration scattered across files

Current Script Structure

  • PerfCore.ps1 (151 lines) - Main entry point called from GitHub Actions
  • DiffPerfToBaseline.ps1 (154 lines) - Orchestrates baseline comparison workflow
  • RunPerfTests.ps1 (60 lines) - Executes performance benchmarks
  • ComparePerfResults.ps1 (35 lines) - Analyzes and compares benchmark results

Dependencies Discovered

PerfCore.ps1 (GitHub Actions Entry Point)
├── DiffPerfToBaseline.ps1 (when -diff flag used)
│   ├── RunPerfTests.ps1 (for baseline and current runs)
│   └── ComparePerfResults.ps1 (for result analysis)
└── RunPerfTests.ps1 (when -diff flag not used)

Proposed Solution: Modular PowerShell Architecture

Phase 1: Create Shared Utility Module

Create build/scripts/perf/PerfUtils.psm1 with shared functions:

  • Show-Invocation - Command invocation logging
  • Get-RepoRoot - Standardized repository root resolution
  • Ensure-Folder - Directory creation utility
  • Write-PerfLog - Standardized logging with consistent formatting
  • Invoke-PerfCommand - Centralized command execution with error handling
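To make the shape concrete, here is a minimal sketch of PerfUtils.psm1 covering three of the proposed functions. The bodies, parameter names, and the relative repo-root depth are assumptions to be confirmed against the existing scripts:

```powershell
# PerfUtils.psm1 - shared helpers for the perf scripts (illustrative sketch)
Set-StrictMode -Version Latest

function Show-Invocation {
    # Echo the exact command line before it runs so CI logs stay debuggable.
    param(
        [Parameter(Mandatory)] [string] $Command,
        [string[]] $Arguments = @()
    )
    Write-Host "[perf] $Command $($Arguments -join ' ')"
}

function Get-RepoRoot {
    # Resolve the repository root relative to build/scripts/perf/ (depth assumed).
    (Resolve-Path (Join-Path $PSScriptRoot '../../..')).Path
}

function Ensure-Folder {
    # Create a directory if it does not exist and return its full path.
    param([Parameter(Mandatory)] [string] $Path)
    if (-not (Test-Path -LiteralPath $Path)) {
        New-Item -ItemType Directory -Path $Path -Force | Out-Null
    }
    (Resolve-Path -LiteralPath $Path).Path
}

Export-ModuleMember -Function Show-Invocation, Get-RepoRoot, Ensure-Folder
```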

Phase 2: Extract Configuration Management

Create build/scripts/perf/PerfConfig.psm1 for:

  • Default paths and constants
  • Parameter validation and normalization
  • Environment detection (CI vs local)
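A possible shape for the configuration module; the property names and defaults below are placeholders rather than values taken from the current scripts:

```powershell
# PerfConfig.psm1 - centralized defaults and environment detection (illustrative sketch)
Set-StrictMode -Version Latest

function Test-IsCI {
    # GitHub Actions sets GITHUB_ACTIONS=true; most other CI systems set CI=true.
    return ($env:GITHUB_ACTIONS -eq 'true') -or ($env:CI -eq 'true')
}

function Get-PerfConfig {
    # Single object holding values currently hard-coded across the four scripts.
    [pscustomobject]@{
        ArtifactsPath   = 'artifacts/performance'  # placeholder default
        BaselineRef     = 'main'                   # placeholder default
        BenchmarkFilter = '*'                      # placeholder default
        IsCI            = Test-IsCI
    }
}

Export-ModuleMember -Function Get-PerfConfig, Test-IsCI
```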

Phase 3: Create Core Business Logic Modules

PerfTestRunner.psm1

  • Invoke-PerformanceTests - Core benchmark execution logic
  • Get-BenchmarkProjects - Project discovery and filtering
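For example, the runner module could look roughly like this. The project-discovery convention and the dotnet invocation are assumptions; the real arguments should be lifted from RunPerfTests.ps1:

```powershell
# PerfTestRunner.psm1 - benchmark execution (illustrative sketch)
Import-Module (Join-Path $PSScriptRoot 'PerfUtils.psm1')

function Get-BenchmarkProjects {
    # Assumes benchmark projects follow a *Benchmark* naming convention.
    param([Parameter(Mandatory)] [string] $RepoRoot)
    Get-ChildItem -Path $RepoRoot -Recurse -Filter '*.csproj' |
        Where-Object { $_.Name -like '*Benchmark*' }
}

function Invoke-PerformanceTests {
    # Run each benchmark project; exact arguments must be ported from RunPerfTests.ps1.
    param(
        [Parameter(Mandatory)] [string] $RepoRoot,
        [Parameter(Mandatory)] [string] $ResultsPath
    )
    Ensure-Folder -Path $ResultsPath | Out-Null
    foreach ($project in Get-BenchmarkProjects -RepoRoot $RepoRoot) {
        $dotnetArgs = @('run', '-c', 'Release', '--project', $project.FullName)
        Show-Invocation -Command 'dotnet' -Arguments $dotnetArgs
        dotnet @dotnetArgs
        if ($LASTEXITCODE -ne 0) { throw "Benchmark run failed for $($project.Name)" }
    }
}

Export-ModuleMember -Function Get-BenchmarkProjects, Invoke-PerformanceTests
```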

PerfResultsAnalyzer.psm1

  • Compare-PerformanceResults - Result comparison logic
  • Export-PerformanceReport - Report generation
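A sketch of the comparison half; it assumes the results are BenchmarkDotNet-style full JSON reports with a Benchmarks array, which needs to be verified against what ComparePerfResults.ps1 reads today:

```powershell
# PerfResultsAnalyzer.psm1 - result comparison (illustrative sketch)
function Compare-PerformanceResults {
    # Emits one record per benchmark with the mean-time delta versus baseline.
    param(
        [Parameter(Mandatory)] [string] $BaselinePath,
        [Parameter(Mandatory)] [string] $CurrentPath,
        [double] $ThresholdPercent = 10   # assumed regression threshold
    )
    $baseline = Get-Content -Raw -LiteralPath $BaselinePath | ConvertFrom-Json
    $current  = Get-Content -Raw -LiteralPath $CurrentPath  | ConvertFrom-Json
    foreach ($bench in $current.Benchmarks) {
        $base = $baseline.Benchmarks | Where-Object { $_.FullName -eq $bench.FullName }
        if (-not $base) { continue }   # benchmark is new; nothing to compare against
        $delta = (($bench.Statistics.Mean - $base.Statistics.Mean) / $base.Statistics.Mean) * 100
        [pscustomobject]@{
            Benchmark    = $bench.FullName
            DeltaPercent = [math]::Round($delta, 2)
            Regressed    = $delta -gt $ThresholdPercent
        }
    }
}

Export-ModuleMember -Function Compare-PerformanceResults
```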

PerfBaselineManager.psm1

  • Get-BaselineResults - Baseline retrieval and caching
  • New-BaselineWorkspace - Git worktree management for baselines
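The worktree-based piece might be sketched like this; Remove-BaselineWorkspace is a hypothetical companion helper not in the list above, and the default ref is an assumption:

```powershell
# PerfBaselineManager.psm1 - baseline checkout via git worktree (illustrative sketch)
function New-BaselineWorkspace {
    # Check the baseline ref out into a throwaway worktree so the current
    # checkout stays untouched while baseline benchmarks run.
    param(
        [Parameter(Mandatory)] [string] $RepoRoot,
        [string] $BaselineRef = 'main',
        [string] $WorktreePath = (Join-Path ([IO.Path]::GetTempPath()) "perf-baseline-$([guid]::NewGuid())")
    )
    git -C $RepoRoot worktree add --detach $WorktreePath $BaselineRef
    if ($LASTEXITCODE -ne 0) { throw "git worktree add failed for ref '$BaselineRef'" }
    return $WorktreePath
}

function Remove-BaselineWorkspace {
    # Hypothetical cleanup helper to tear the worktree down after the baseline run.
    param(
        [Parameter(Mandatory)] [string] $RepoRoot,
        [Parameter(Mandatory)] [string] $WorktreePath
    )
    git -C $RepoRoot worktree remove --force $WorktreePath
}

Export-ModuleMember -Function New-BaselineWorkspace, Remove-BaselineWorkspace
```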

Phase 4: Refactor Entry Point Scripts

Transform existing scripts into thin orchestration layers that:

  • Import required modules
  • Parse and validate parameters
  • Orchestrate module function calls
  • Handle top-level error scenarios
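The target shape for PerfCore.ps1, for illustration only; the real script's parameter names and contract must stay exactly as GitHub Actions invokes them today:

```powershell
# PerfCore.ps1 - thin orchestration layer (illustrative sketch)
[CmdletBinding()]
param(
    [switch] $Diff   # stands in for the existing -diff flag
)

$ErrorActionPreference = 'Stop'
Import-Module (Join-Path $PSScriptRoot 'PerfUtils.psm1')
Import-Module (Join-Path $PSScriptRoot 'PerfConfig.psm1')
Import-Module (Join-Path $PSScriptRoot 'PerfTestRunner.psm1')

try {
    $config   = Get-PerfConfig
    $repoRoot = Get-RepoRoot
    if ($Diff) {
        # Baseline comparison stays in DiffPerfToBaseline.ps1 as a thin wrapper.
        & (Join-Path $PSScriptRoot 'DiffPerfToBaseline.ps1')
    }
    else {
        Invoke-PerformanceTests -RepoRoot $repoRoot -ResultsPath (Join-Path $repoRoot $config.ArtifactsPath)
    }
}
catch {
    Write-Error "Performance run failed: $_"
    exit 1
}
```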

Implementation Plan

Step 1: Analysis and Planning ✅

  • [x] Inventory existing scripts and functions
  • [x] Map dependencies and call patterns
  • [x] Identify shared functionality and duplication
  • [x] Design modular architecture

Step 2: Create Shared Utilities Module

  • [ ] Create PerfUtils.psm1 with Show-Invocation function
  • [ ] Add Get-RepoRoot, Ensure-Folder, and logging utilities
  • [ ] Add comprehensive unit tests for utility functions
  • [ ] Update DiffPerfToBaseline.ps1 to import and use the module
  • [ ] Update PerfCore.ps1 to import and use the module

Step 3: Extract Configuration Management

  • [ ] Create PerfConfig.psm1 with default paths and constants
  • [ ] Add parameter validation and environment detection functions
  • [ ] Update all scripts to use centralized configuration

Step 4: Create Business Logic Modules

  • [ ] Create PerfTestRunner.psm1 and extract benchmark execution logic
  • [ ] Create PerfResultsAnalyzer.psm1 and extract comparison logic
  • [ ] Create PerfBaselineManager.psm1 and extract baseline management
  • [ ] Add comprehensive unit tests for all business logic modules

Step 5: Refactor Entry Point Scripts

  • [ ] Simplify PerfCore.ps1 to orchestration-only
  • [ ] Simplify DiffPerfToBaseline.ps1 to workflow orchestration
  • [ ] Simplify RunPerfTests.ps1 and ComparePerfResults.ps1 to thin wrappers
  • [ ] Maintain backward compatibility for existing GitHub Actions integration

Step 6: Testing and Validation

  • [ ] Add unit tests for all modules using Pester testing framework
  • [ ] Add integration tests for end-to-end workflows
  • [ ] Validate GitHub Actions integration works unchanged
  • [ ] Run performance regression tests against the refactored architecture
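As a concrete starting point for the Pester work, a unit test for one of the sketched utility functions could look like this (Pester 5 syntax; the file and function names assume the PerfUtils sketch above):

```powershell
# PerfUtils.Tests.ps1 - example Pester 5 tests for the shared utilities
BeforeAll {
    Import-Module (Join-Path $PSScriptRoot 'PerfUtils.psm1') -Force
}

Describe 'Ensure-Folder' {
    It 'creates the folder when it does not exist' {
        $path = Join-Path $TestDrive 'new-folder'
        Ensure-Folder -Path $path | Out-Null
        Test-Path $path | Should -BeTrue
    }

    It 'does not throw when the folder already exists' {
        $path = Join-Path $TestDrive 'existing'
        New-Item -ItemType Directory -Path $path | Out-Null
        { Ensure-Folder -Path $path } | Should -Not -Throw
    }
}
```

Running Invoke-Pester -Path build/scripts/perf from the repo root would pick these tests up both locally and in CI.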

Step 7: Documentation and Cleanup

  • [ ] Create/update build/scripts/perf/README.md with new architecture
  • [ ] Add inline documentation and examples for all public module functions
  • [ ] Remove duplicated code and consolidate error handling
  • [ ] Add module manifest files for proper PowerShell module structure
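For the manifest item, each .psm1 would get a sibling .psd1, generated once and then kept in source control; the version and export list below are placeholders:

```powershell
# One-time generation of a manifest for the shared utilities module.
New-ModuleManifest -Path ./build/scripts/perf/PerfUtils.psd1 `
    -RootModule 'PerfUtils.psm1' `
    -ModuleVersion '0.1.0' `
    -FunctionsToExport @('Show-Invocation', 'Get-RepoRoot', 'Ensure-Folder', 'Write-PerfLog', 'Invoke-PerfCommand')
```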

Success Criteria

  • [ ] Zero code duplication between performance testing scripts
  • [ ] All shared functionality consolidated into reusable modules
  • [ ] Consistent error handling and logging across all scripts
  • [ ] Comprehensive unit test coverage (>80%) for all modules
  • [ ] GitHub Actions integration continues to work without modification
  • [ ] Reduced total lines of code through consolidation
  • [ ] Improved debugging experience with consistent Show-Invocation logging

Risks and Mitigation

Risk: Breaking GitHub Actions integration
Mitigation: Maintain existing script interfaces and parameter contracts

Risk: Introducing new bugs during refactoring
Mitigation: Comprehensive testing strategy with both unit and integration tests

Risk: PowerShell module loading complexity
Mitigation: Use explicit Import-Module statements and test module loading scenarios

Resources


Backlinks:

  • PR: https://github.com/rjmurillo/moq.analyzers/pull/571
  • Comment: https://github.com/rjmurillo/moq.analyzers/pull/571#discussion_r2221235499
  • Requested by: @rjmurillo
