
Refactor build pipeline for performance and dev output reliability

Open PeterDaveHello opened this pull request 7 months ago • 47 comments

User description

Replace JS/CSS minifiers with esbuild to reduce build times while preserving outputs.

Enable thread-loader by default in dev and production. Choose workers dynamically (based on CPU cores) and allow overrides via environment variables. Keep filesystem cache enabled and make cache compression configurable, defaulting to uncompressed for faster warm builds on CPU-bound machines. Add BUILD_PARALLEL toggle (default on) to switch between parallel and sequential production variant builds. Ensure watch-once dev builds exit cleanly.
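As a rough sketch of the worker-selection rule described above (names are illustrative, not the PR's actual identifiers; the real build.mjs may handle more edge cases):

```javascript
// Pick a thread-loader worker count: an explicit BUILD_THREAD_WORKERS value
// wins but is capped at the core count; otherwise default to one worker per
// core. In the real script, cpuCount would come from os.cpus().length.
function resolveWorkerCount(envValue, cpuCount) {
  const max = Math.max(1, cpuCount)
  const parsed = Number.parseInt(envValue ?? '', 10)
  if (Number.isInteger(parsed) && parsed > 0) return Math.min(parsed, max)
  return max
}
```

A caller would pass `process.env.BUILD_THREAD_WORKERS` and `os.cpus().length`; missing or invalid values fall back to the core count.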

Adopt sass-embedded for Sass processing. In development, use style-loader to speed up CSS/SCSS compilation while keeping production outputs unchanged. Maintain CSP-safe source maps for extension contexts and suppress CSS 404 noise in development outputs.
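The dev/prod split can be pictured as a conditional loader chain (a minimal sketch of the behavior described above; `extractLoader` stands in for whatever extraction loader production uses, and the option shape is an assumption):

```javascript
// In development, style-loader injects CSS through <style> tags for fast
// rebuilds; in production, the extraction loader keeps emitting real .css
// files so shipped output is unchanged. sassImplementation would be the
// sass-embedded module.
function scssLoaderChain(isProduction, extractLoader, sassImplementation) {
  return [
    isProduction ? extractLoader : 'style-loader',
    'css-loader',
    { loader: 'sass-loader', options: { implementation: sassImplementation } },
  ]
}
```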

Additionally, dependency caching has been added to the GitHub Actions workflow to accelerate CI/CD runs.

Results on this DO VPS (2 cores, ~4 GiB RAM):

  • Production first run (cold): ~44s (baseline ~105s)
  • Production second run (warm): ~19s (baseline ~39s)
  • Development first run: ~31s; second run: ~29s

Times vary by environment; numbers above are for this machine.

Pull request summary from GitHub Copilot:

This pull request introduces several significant improvements to the build process, focusing on performance, configurability, and developer experience. The main changes include switching to faster build tools, adding advanced caching and parallelization options, updating build scripts, and improving CI build efficiency. These updates should result in faster builds, better resource utilization, and more flexibility for both local development and CI environments.

Build Performance and Tooling Improvements

  • Switched JS minification from Terser to esbuild (via esbuild-loader) and enabled esbuild-based CSS minification for much faster production builds. (build.mjs, package.json) [1] [2]
  • Added support for parallel builds of production variants, controlled via the BUILD_PARALLEL environment variable, and enabled thread-loader for Babel by default (configurable via BUILD_THREAD and BUILD_THREAD_WORKERS). (build.mjs, .github/copilot-instructions.md) [1] [2] [3]
  • Introduced advanced Webpack cache options, allowing filesystem cache compression to be toggled with BUILD_CACHE_COMPRESSION for improved warm build performance. (build.mjs, .github/copilot-instructions.md) [1] [2]
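A hedged sketch of how the cache toggle might be wired (the env-var name comes from the bullets above; the default and validation are assumptions):

```javascript
// BUILD_CACHE_COMPRESSION=gzip|brotli opts into webpack's cache compression;
// anything else (including unset) leaves the filesystem cache uncompressed,
// which warms faster on CPU-bound machines.
function cacheConfig(variantId, compressionEnv) {
  const compression =
    compressionEnv === 'gzip' || compressionEnv === 'brotli' ? compressionEnv : false
  return {
    type: 'filesystem',
    name: `webpack-${variantId}`, // separate cache per build variant
    compression,
  }
}
```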

Build Script and Output Enhancements

  • Updated the build script to support both parallel and sequential builds, improved output directory management, and ensured correct copying of build artifacts for both production and development. (build.mjs) [1] [2]
  • In development mode, the script now creates placeholder CSS files to prevent 404 errors in the browser. (build.mjs) [1] [2]
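That placeholder step might look roughly like this (file names and layout are hypothetical; since dev styles are injected by style-loader, no real .css assets exist on disk):

```javascript
import fs from 'node:fs'
import path from 'node:path'

// Write empty stubs for stylesheets the extension still references, so the
// browser's requests resolve instead of logging 404s during development.
function writeCssPlaceholders(outDir, fileNames) {
  fs.mkdirSync(outDir, { recursive: true })
  const written = []
  for (const name of fileNames) {
    const file = path.join(outDir, name)
    if (!fs.existsSync(file)) {
      fs.writeFileSync(file, '/* dev placeholder: styles injected by style-loader */\n')
      written.push(file)
    }
  }
  return written
}
```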

Continuous Integration (CI) Improvements

  • Added npm and Webpack cache steps to the pre-release GitHub Actions workflow to speed up CI builds. (.github/workflows/pre-release-build.yml)

Dependency Updates

  • Added new dependencies: esbuild, esbuild-loader, sass-embedded, style-loader, and thread-loader; updated sass and sass-loader to latest versions; removed terser-webpack-plugin as it's no longer used. (package.json) [1] [2]

Documentation

  • Expanded the build instructions to document all new performance-related environment variables and their effects. (.github/copilot-instructions.md)

PR Type

Enhancement


Description

  • Replace Terser/CSS minifiers with esbuild for faster builds

  • Enable thread-loader by default with configurable parallelism

  • Add filesystem cache compression control and parallel build options

  • Switch to sass-embedded and style-loader for dev builds

  • Add GitHub Actions dependency caching for CI acceleration


Diagram Walkthrough

flowchart LR
  A["Old Build System"] --> B["esbuild Minification"]
  A --> C["Thread-loader Parallelism"]
  A --> D["Advanced Webpack Caching"]
  A --> E["sass-embedded Processing"]
  B --> F["Faster Production Builds"]
  C --> F
  D --> F
  E --> F
  G["GitHub Actions"] --> H["Dependency Caching"]
  H --> I["Faster CI Builds"]

File Walkthrough

Relevant files
Enhancement
build.mjs
Major build system optimization with esbuild and threading

build.mjs

  • Replace Terser with esbuild for JS/CSS minification
  • Add thread-loader with configurable worker count
  • Implement filesystem cache compression control
  • Enable parallel/sequential build modes via BUILD_PARALLEL
  • Switch to sass-embedded and conditional style-loader usage
  • Add watch-once mode for development builds
+167/-51
pre-release-build.yml
Add dependency and build caching to CI workflow                   

.github/workflows/pre-release-build.yml

  • Add npm cache using actions/cache@v4
  • Add Webpack filesystem cache for build acceleration
  • Cache both ~/.npm and .cache/webpack directories
+16/-0   
Dependencies
package.json
Update dependencies for build optimization                             

package.json

  • Add esbuild and esbuild-loader dependencies
  • Replace terser-webpack-plugin with esbuild-loader
  • Update sass to sass-embedded with newer sass-loader
  • Add style-loader and thread-loader dependencies
+7/-3     
Documentation
copilot-instructions.md
Document new build performance configuration options         

.github/copilot-instructions.md

  • Document new build performance environment variables
  • Explain BUILD_PARALLEL, BUILD_THREAD, BUILD_CACHE_COMPRESSION options
  • Add BUILD_WATCH_ONCE documentation for development
+16/-0   

Summary by CodeRabbit

  • Chores
    • Improved pre-release CI: Node-major-aware cache keys and enabled npm + Webpack caching to speed up builds.
  • Refactor
    • Switched to esbuild-based JS/CSS minification and updated Sass/CSS loader pipeline.
    • Added parallel, variant-aware builds with better caching, threading, and more robust build/finalization behavior.
  • Documentation
    • Added Build Performance Options documenting env vars for parallelism, threading, cache compression, watch-once, source maps, and output layout.

PeterDaveHello avatar Aug 30 '25 20:08 PeterDaveHello

[!NOTE]

Other AI code review bot(s) detected

CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough

The PR updates the pre-release CI workflow to add Node-aware npm and Webpack caching, overhauls build.mjs to introduce variant-aware, parallelizable builds with esbuild-based minification and threaded loaders, adjusts package.json dependencies accordingly, and expands AGENTS.md with build performance and output structure documentation.

Changes

Cohort / File(s) Summary
CI workflow caching
.github/workflows/pre-release-build.yml
Adds Node major detection and exports NODE_MAJOR; enables npm cache via actions/setup-node; adds actions/cache for Webpack filesystem cache keyed by OS, NODE_MAJOR, and lockfile hash; restores/saves Webpack cache; build commands unchanged.
Build system core overhaul
build.mjs
Introduces variant-aware, optionally parallel build orchestration; updates runWebpack and finishOutput signatures to accept sourceBuildDir; switches to esbuild-based JS/CSS minification; conditional thread-loader support and thread pool controls; dynamic sass-embedded import with fallback; per-variant filesystem caches and output dirs; improved error handling, streaming zipping, file copying, and dev CSS placeholders/source maps.
Dependencies and tooling
package.json
Adds @babel/runtime; adds esbuild and esbuild-loader; upgrades/changes Sass tooling (sass, sass-embedded, sass-loader); adds style-loader and thread-loader; removes terser-webpack-plugin.
Documentation updates
AGENTS.md
Adds Build Performance Options documenting BUILD_PARALLEL, BUILD_THREAD, BUILD_THREAD_WORKERS, BUILD_CACHE_COMPRESSION, BUILD_WATCH_ONCE, BUILD_POOL_TIMEOUT; notes source map behavior, resolve.symlinks rationale, and detailed production output structure and variants.

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor Dev as Developer / CI
  participant Build as build.mjs
  participant WP as Webpack Compiler
  participant FS as Filesystem
  participant Zip as Zipper

  Dev->>Build: build()
  activate Build
  Build->>Build: derive variants + env (parallel, threads, cache)
  alt Parallel enabled
    par For each variant
      Build->>WP: runWebpack(..., sourceBuildDir)
      WP-->>FS: emit assets to per-variant dir
      Build->>Build: finishOutput(suffix, sourceBuildDir)
      Build->>Zip: zipFolder(...)
    and For next variant
    end
  else Sequential
    loop variants
      Build->>WP: runWebpack(..., sourceBuildDir)
      WP-->>FS: emit assets to per-variant dir
      Build->>Build: finishOutput(suffix, sourceBuildDir)
      Build->>Zip: zipFolder(...)
    end
  end
  Build-->>Dev: exit 0 or error (non-zero)
  deactivate Build
sequenceDiagram
  autonumber
  actor CI as GitHub Actions
  participant Setup as actions/setup-node
  participant CacheNpm as npm cache (setup-node)
  participant CacheWP as actions/cache (Webpack)
  participant Job as build job

  CI->>Setup: setup-node with cache: npm
  Setup->>CacheNpm: restore by OS + lockfile hash
  CI->>Job: Detect NODE_MAJOR and export
  CI->>CacheWP: restore .cache/webpack using OS + NODE_MAJOR + lockfile hash
  CI->>Job: npm ci && npm run build
  Job-->>CacheWP: save updated Webpack cache (post)

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Possibly related PRs

  • ChatGPTBox-dev/chatGPTBox#891 — Modifies AGENTS.md; likely overlaps in documentation sections and contributor guidance.

Suggested labels

Review effort 2/5

Poem

I thump my paws—cache warms the trail,
Threads hum softly, builds set sail.
Esbuild breezes, Sass in tune,
Variants dance from noon to moon.
Zip! goes the bundle—swift and light. 🐇✨

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 25.00%, below the required threshold of 80.00%. Run @coderabbitai generate docstrings to improve coverage.
✅ Passed checks (2 passed)
  • Description Check (✅ Passed): Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check (✅ Passed): The title succinctly and accurately reflects the primary change: a refactor of the build pipeline aimed at performance improvements (esbuild, caching, parallel/threaded builds) and improved developer output reliability (watch-once behavior, dev CSS placeholders, stable source maps). It is concise, specific, and clear enough for a teammate scanning PR history to understand the main intent.

coderabbitai[bot] avatar Aug 30 '25 20:08 coderabbitai[bot]

PR Reviewer Guide 🔍

Here are some key observations to aid the review process:

⏱️ Estimated effort to review: 4 🔵🔵🔵🔵⚪
🧪 No relevant tests
🔒 No security concerns identified
⚡ Recommended focus areas for review

Possible Issue

The new runWebpack signature adds sourceBuildDir before callback, but many internal calls pass parameters that may shift positions incorrectly if any call sites or future edits assume the old order. Validate all invocations use the new 5-arg form consistently and that sourceBuildDir is always a string when expected.

async function runWebpack(isWithoutKatex, isWithoutTiktoken, minimal, sourceBuildDir, callback) {
  const shared = [
    'preact',
    'webextension-polyfill',
    '@primer/octicons-react',
    'react-bootstrap-icons',
    'countries-list',
Watch Exit Logic

In dev mode, compiler.watch closes only when BUILD_WATCH_ONCE is set, but callback still runs after close is requested. Confirm no race or partial close causes unresolved promises or dangling processes in CI. Consider handling watching.close completion before resolving.

if (isProduction) compiler.run(callback)
else {
  const watching = compiler.watch({}, (err, stats) => {
    callback(err, stats)
    if (process.env.BUILD_WATCH_ONCE) watching.close(() => {})
  })
}
Cache Key Specificity

Webpack cache key uses only package-lock.json hash; config/env-driven build flags (e.g., BUILD_CACHE_COMPRESSION, BUILD_THREAD_WORKERS) won’t bust cache. Validate cache correctness across differing CI env vars or include build.mjs hash/env in the cache key.

- name: Cache Webpack filesystem cache
  uses: actions/cache@v4
  with:
    path: |
      .cache/webpack
      node_modules/.cache/webpack
    key: ${{ runner.os }}-webpack-${{ hashFiles('**/package-lock.json') }}
    restore-keys: |
      ${{ runner.os }}-webpack-

qodo-code-review[bot] avatar Aug 30 '25 20:08 qodo-code-review[bot]

PR Code Suggestions ✨

Latest suggestions up to 5663e3c

Category / Suggestion / Impact
Possible issue
Ensure production failures exit non-zero

In production, if the callback throws or returns a rejected promise, the process
may succeed silently because only process.exitCode is set on close errors. Set
process.exitCode = 1 when stats contain errors or when the callback fails,
ensuring CI fails reliably. Mirror the watch-once failure behavior.

build.mjs [386-430]

 if (isProduction) {
   // Ensure compiler is properly closed after production runs
   compiler.run((err, stats) => {
+    const hasErrors = !!(
+      err ||
+      (stats && typeof stats.hasErrors === 'function' && stats.hasErrors())
+    )
+    let callbackFailed = false
     const finishClose = () =>
       compiler.close((closeErr) => {
         if (closeErr) {
           console.error('Error closing compiler:', closeErr)
           process.exitCode = 1
         }
+        if (hasErrors || callbackFailed) {
+          process.exitCode = 1
+        }
       })
     try {
       const ret = callback(err, stats)
       if (ret && typeof ret.then === 'function') {
-        ret.then(finishClose, finishClose)
+        ret.then(finishClose, () => {
+          callbackFailed = true
+          finishClose()
+        })
       } else {
         finishClose()
       }
     } catch (_) {
+      callbackFailed = true
       finishClose()
     }
   })
 } else {
   const watching = compiler.watch({}, (err, stats) => {
     const hasErrors = !!(
       err ||
       (stats && typeof stats.hasErrors === 'function' && stats.hasErrors())
     )
     // Normalize callback return into a Promise to catch synchronous throws
     const ret = Promise.resolve().then(() => callback(err, stats))
     if (isWatchOnce) {
       const finalize = (callbackFailed = false) =>
         watching.close((closeErr) => {
           if (closeErr) console.error('Error closing watcher:', closeErr)
           // Exit explicitly to prevent hanging processes in CI
-          // Use non-zero exit code when errors occurred, including callback failures
           const shouldFail = hasErrors || closeErr || callbackFailed
           process.exit(shouldFail ? 1 : 0)
         })
       ret.then(
         () => finalize(false),
         () => finalize(true),
       )
     }
   })
 }
Suggestion importance[1-10]: 8


Why: This is a critical bug fix for CI/CD, as it ensures that a failed production build correctly returns a non-zero exit code, preventing broken builds from being treated as successful.

Medium
Fix boolean env parsing

This helper treats any non-empty string as true, so BUILD_WATCH_ONCE="" or an
unknown value becomes truthy, which can unexpectedly exit watchers in CI. Align
its behavior with isDisabledEnv to only enable on explicit truthy tokens. This
prevents accidental activation from empty or malformed env values.

build.mjs [18-22]

 function parseBooleanEnv(val) {
   if (val == null) return false
   const s = String(val).trim().toLowerCase()
-  return !(s === '' || s === '0' || s === 'false' || s === 'no' || s === 'off')
+  if (s === '' || s === '0' || s === 'false' || s === 'no' || s === 'off' || s === 'null' || s === 'undefined') return false
+  if (s === '1' || s === 'true' || s === 'yes' || s === 'on') return true
+  return false
 }
Suggestion importance[1-10]: 5


Why: The suggestion correctly identifies that parseBooleanEnv is too lenient, which could cause unexpected behavior; the proposed change makes environment variable parsing more robust.

Low
Incremental
Harden cache invalidation keys

Include process.version and the major versions of critical loaders/minifiers in
the cache version to prevent stale cache reuse across Node or plugin upgrades
that may change output semantics

build.mjs [149-165]

-const dirKey = path.basename(sourceBuildDir || outdir)
-const variantParts = [
-  isWithoutKatex ? 'no-katex' : 'with-katex',
-  isWithoutTiktoken ? 'no-tiktoken' : 'with-tiktoken',
-  minimal ? 'minimal' : 'full',
-  dirKey,
-  isProduction ? 'prod' : 'dev',
-]
-const variantId = variantParts.join('__')
+const toolVersions = {
+  node: process.version,
+  webpack: webpack.version,
+  loaders: {
+    babelLoader: (() => {
+      try { return require('babel-loader/package.json').version } catch { return 'unknown' }
+    })(),
+    cssMinimizer: (() => {
+      try { return require('css-minimizer-webpack-plugin/package.json').version } catch { return 'unknown' }
+    })(),
+    esbuild: (() => {
+      try { return require('esbuild/package.json').version } catch { return 'unknown' }
+    })(),
+    esbuildLoader: (() => {
+      try { return require('esbuild-loader/package.json').version } catch { return 'unknown' }
+    })(),
+    miniCssExtract: (() => {
+      try { return require('mini-css-extract-plugin/package.json').version } catch { return 'unknown' }
+    })(),
+  },
+}
 ...
 cache: {
   type: 'filesystem',
   name: `webpack-${variantId}`,
-  // Only include dimensions that affect module outputs to avoid
-  // unnecessary cache invalidations across machines/CI runners
-  version: JSON.stringify({ PROD: isProduction }),
-  // default none; override via BUILD_CACHE_COMPRESSION=gzip|brotli
+  version: JSON.stringify({ PROD: isProduction, tools: toolVersions }),
   compression: cacheCompressionOption ?? false,
   buildDependencies: {
     config: [
       path.resolve('build.mjs'),
       ...['package.json', 'package-lock.json']
         .map((p) => path.resolve(p))
         .filter((p) => fs.existsSync(p)),
     ],
   },
 },


Suggestion importance[1-10]: 7


Why: The suggestion correctly identifies that the Webpack cache key should include versions of key build tools to prevent stale cache issues after dependency upgrades, which is a valuable improvement for cache robustness.

Medium
Stabilize Sass builds and warnings

Pass sassOptions: { quietDeps: true } so Sass stops emitting warnings for
stylesheets pulled in from node_modules, reducing noisy output and keeping dev
watch and CI logs stable.

build.mjs [284-288]

 {
   loader: 'sass-loader',
-  options: { implementation: sassImpl },
+  options: {
+    implementation: sassImpl,
+    sassOptions: {
+      quietDeps: true,
+    },
+  },
 },
Suggestion importance[1-10]: 6

Why: This suggestion correctly proposes using sassOptions: { quietDeps: true } to suppress warnings from third-party Sass files, which improves the developer experience by reducing log noise during builds.

Low
General
Validate worker count strictly

Number.isInteger(raw) passes for any value parseInt salvages: parseInt('1e2', 10)
returns 1, silently hiding invalid input. Reject non-decimal numeric formats to
avoid surprising caps or overuse. Add a strict regex check so only plain
positive integers are accepted.

build.mjs [48-65]

-const cacheCompressionOption = parseCacheCompressionOption(process.env.BUILD_CACHE_COMPRESSION)
-let cpuCount = 1
-try {
-  // os.cpus() returns an array in Node.js; guard with try/catch for portability
-  cpuCount = Math.max(1, os.cpus().length || 1)
-} catch {
-  cpuCount = 1
-}
 function parseThreadWorkerCount(envValue, cpuCount) {
   const maxWorkers = Math.max(1, cpuCount)
   if (envValue !== undefined && envValue !== null) {
-    const raw = parseInt(envValue, 10)
-    if (Number.isInteger(raw) && raw > 0) {
+    const str = String(envValue).trim()
+    if (/^[1-9]\d*$/.test(str)) {
+      const raw = Number(str)
       if (raw > maxWorkers) {
-        console.warn(
-          `[build] BUILD_THREAD_WORKERS=${raw} exceeds CPU count (${cpuCount}); capping to ${maxWorkers}`,
-        )
+        console.warn(`[build] BUILD_THREAD_WORKERS=${raw} exceeds CPU count (${cpuCount}); capping to ${maxWorkers}`)
       }
       return Math.min(raw, maxWorkers)
     }
-    console.warn(
-      `[build] Invalid BUILD_THREAD_WORKERS="${envValue}", defaulting to CPU count (${cpuCount})`,
-    )
+    console.warn(`[build] Invalid BUILD_THREAD_WORKERS="${envValue}", defaulting to CPU count (${cpuCount})`)
   }
   return maxWorkers
 }


Suggestion importance[1-10]: 5


Why: The suggestion correctly identifies a weakness in using parseInt for validation and provides a more robust regex-based solution, preventing unexpected behavior from malformed environment variables.

Low

Previous suggestions

✅ Suggestions up to commit 6c89d29
Category / Suggestion / Impact
General
Prevent unnecessary cache busting
Suggestion Impact:The commit removed THREAD and WORKERS (and other fields) from the cache version and limited it to { PROD: isProduction }, aligning with the suggestion to avoid unstable values in the version.

code diff:

-      version: JSON.stringify({
-        THREAD: enableThread,
-        WORKERS: threadWorkers,
-        CACHE_COMP: cacheCompressionOption ?? false,
-        PROD: isProduction,
-      }),
+      // Only include dimensions that affect module outputs to avoid
+      // unnecessary cache invalidations across machines/CI runners
+      version: JSON.stringify({ PROD: isProduction }),

Avoid putting unstable values (like worker count) into the cache version, or it
will constantly invalidate caches across machines/CI runners. Limit the cache
version to configuration that truly affects module outputs.

build.mjs [138-157]

 const compiler = webpack({
   ...
   devtool: isProduction ? false : 'cheap-module-source-map',
   cache: {
     type: 'filesystem',
     name: `webpack-${variantId}`,
     version: JSON.stringify({
       THREAD: enableThread,
-      WORKERS: threadWorkers,
       CACHE_COMP: cacheCompressionOption ?? false,
       PROD: isProduction,
     }),
-    // default none; override via BUILD_CACHE_COMPRESSION=gzip|brotli
     compression: cacheCompressionOption ?? false,
     buildDependencies: {
       config: [
         path.resolve('build.mjs'),
         ...['package.json', 'package-lock.json']
           .map((p) => path.resolve(p))
           .filter((p) => fs.existsSync(p)),
       ],
     },
   },
   ...
 })
Suggestion importance[1-10]: 7


Why: The suggestion correctly points out that including the threadWorkers count, which depends on the machine's CPU cores, in the cache version will reduce cache hits across different environments like local development and CI.

Medium
Incremental
Make symlink resolution configurable

Add a conditional to re-enable resolve.symlinks when an explicit environment
flag is set for linked/workspace development to avoid module duplication and
resolution bugs when using npm link/pnpm during local development.

build.mjs [204-211]

 resolve: {
   extensions: ['.jsx', '.mjs', '.js'],
-  // Disable symlink resolution for consistent behavior/perf; note this can
-  // affect `npm link`/pnpm workspaces during local development
-  symlinks: false,
+  // Disable symlink resolution by default; allow opt-in for linked/workspace dev
+  symlinks: process.env.BUILD_RESOLVE_SYMLINKS === '1',
   alias: {
     parse5: path.resolve(__dirname, 'node_modules/parse5'),
     ...(minimal
Suggestion importance[1-10]: 6


Why: The suggestion improves developer experience by adding a configurable flag for symlinks, which the PR's own comments identify as a potential issue for local development with linked packages.

Low
Possible issue
Fix env flag parsing
Suggestion Impact:The commit added a parseBooleanEnv function and used it to compute isWatchOnce, normalizing values like "0", "false", "no", "off", and empty string to false, addressing the fragile truthiness check.

code diff:

+function parseBooleanEnv(val) {
+  if (val == null) return false
+  const s = String(val).trim().toLowerCase()
+  return !(s === '' || s === '0' || s === 'false' || s === 'no' || s === 'off')
+}
+const isWatchOnce = parseBooleanEnv(process.env.BUILD_WATCH_ONCE)

Parse boolean-like env flags explicitly; the current truthiness check treats any
non-empty string (including "0" or "false") as true. This can unintentionally
enable watch-once mode. Normalize values like "0", "false", and "" to false.

build.mjs [16-18]

 const isAnalyzing = process.argv[2] === '--analyze'
 const parallelBuild = process.env.BUILD_PARALLEL !== '0'
-const isWatchOnce = !!process.env.BUILD_WATCH_ONCE
+const isWatchOnce = (() => {
+  const v = process.env.BUILD_WATCH_ONCE
+  if (v == null) return false
+  const s = String(v).trim().toLowerCase()
+  return !(s === '' || s === '0' || s === 'false' || s === 'no' || s === 'off')
+})()


Suggestion importance[1-10]: 6


Why: The suggestion correctly identifies that !!process.env.BUILD_WATCH_ONCE is a fragile way to parse a boolean environment variable, leading to incorrect behavior if a user sets it to "0" or "false".

Low
✅ Suggestions up to commit abfc14d
Category / Suggestion / Impact
Incremental
Use external source maps in dev

Align devtool with external map files by switching to external source maps in
development to avoid large inlined maps and ensure the .map files actually exist
for copying.

build.mjs [137-522]

-devtool: isProduction ? false : 'cheap-module-source-map',
+devtool: isProduction ? false : 'source-map',
 ...
-// Dev-only: copy external source maps for CSP-safe debugging
 ...(isProduction
   ? []
   : [
       { src: `${sourceBuildDir}/shared.js.map`, dst: 'shared.js.map' },
       { src: `${sourceBuildDir}/content-script.js.map`, dst: 'content-script.js.map' },
       { src: `${sourceBuildDir}/background.js.map`, dst: 'background.js.map' },
       { src: `${sourceBuildDir}/popup.js.map`, dst: 'popup.js.map' },
       { src: `${sourceBuildDir}/IndependentPanel.js.map`, dst: 'IndependentPanel.js.map' },
     ]),
Suggestion importance[1-10]: 7


Why: The suggestion correctly identifies that cheap-module-source-map inlines source maps, which contradicts the goal of copying external .map files for CSP-safe debugging in development.

Medium
Possible issue
Resolve zip on finish or close
Suggestion Impact:The commit added handling for the 'close' event and refactored resolution via a shared done function, ensuring the Promise resolves when the stream closes. It also modified the 'finish' handler to trigger a close, addressing the robustness concern.

code diff:

+    const done = () => {
+      if (!settled) {
+        settled = true
+        resolve()
+      }
+    }
     output.once('error', fail)
     archive.once('error', fail)
     archive.on('warning', (err) => {
       // Log non-fatal archive warnings for diagnostics
       console.warn('[build][zip] warning:', err)
     })
+    // Resolve on close to ensure FD is flushed and closed
+    output.once('close', done)
+    // Ensure close is emitted after finish on some fast runners
     output.once('finish', () => {
-      if (!settled) {
-        settled = true
-        resolve()
+      try {
+        if (typeof output.close === 'function') output.close()
+      } catch (_) {
+        // ignore
       }
     })

If archiver emits 'close' without 'finish' in some Node streams, the promise may
never resolve. Listen to both 'finish' and 'close' on the output stream to
guarantee completion across environments, avoiding stuck builds in CI.

build.mjs [425-452]

 const zipPath = `${dir}.zip`
 await fs.ensureDir(path.dirname(zipPath))
 await new Promise((resolve, reject) => {
   const output = fs.createWriteStream(zipPath)
   const archive = archiver('zip', { zlib: { level: 9 } })
   let settled = false
+  const settleOk = () => {
+    if (!settled) {
+      settled = true
+      resolve()
+    }
+  }
   const fail = (err) => {
     if (!settled) {
       settled = true
       reject(err)
     }
   }
   output.once('error', fail)
+  output.once('finish', settleOk)
+  output.once('close', settleOk)
   archive.once('error', fail)
   archive.on('warning', (err) => {
-    // Log non-fatal archive warnings for diagnostics
     console.warn('[build][zip] warning:', err)
-  })
-  output.once('finish', () => {
-    if (!settled) {
-      settled = true
-      resolve()
-    }
   })
   archive.pipe(output)
   archive.directory(dir, false)
   archive.finalize()
 })
Suggestion importance[1-10]: 7


Why: This is a good suggestion for improving robustness. While the finish event on a write stream is generally sufficient, adding a handler for close provides a fallback to prevent hung builds in edge cases, which is valuable for CI stability.

Medium
General
Reduce default thread oversubscription

Spawning threads equal to total CPUs can oversubscribe when Webpack already
parallelizes minifiers and loaders, causing slowdowns or OOM in CI. Cap default
workers to CPU-1 (min 1) unless explicitly overridden to reduce contention.

build.mjs [55]

+function parseThreadWorkerCount(envValue, cpuCount) {
+  const hardMax = Math.max(1, cpuCount - 1) // leave one core for the main thread/system
+  if (envValue !== undefined && envValue !== null) {
+    const raw = parseInt(envValue, 10)
+    if (Number.isInteger(raw) && raw > 0) {
+      if (raw > cpuCount) {
+        console.warn(
+          `[build] BUILD_THREAD_WORKERS=${raw} exceeds CPU count (${cpuCount}); capping to ${cpuCount}`,
+        )
+      }
+      return Math.min(raw, cpuCount)
+    }
+    console.warn(
+      `[build] Invalid BUILD_THREAD_WORKERS="${envValue}", defaulting to ${hardMax}`,
+    )
+  }
+  return hardMax
+}
 const threadWorkers = parseThreadWorkerCount(process.env.BUILD_THREAD_WORKERS, cpuCount)
Suggestion importance[1-10]: 6

__

Why: The suggestion correctly identifies a potential performance issue with thread oversubscription and proposes a reasonable heuristic (CPU-1) to mitigate it, improving build performance on multi-core machines without explicit configuration.

Low
✅ Suggestions up to commit b5f27c3
Category / Suggestion / Impact
General
Wait for stream close for ZIP
Suggestion Impact:The commit added resolving on the 'close' event and adjusted the 'finish' handler to trigger close, ensuring the promise resolves after the stream is closed, aligning with the suggestion's intent.

code diff:

+    const done = () => {
+      if (!settled) {
+        settled = true
+        resolve()
+      }
+    }
     output.once('error', fail)
     archive.once('error', fail)
     archive.on('warning', (err) => {
       // Log non-fatal archive warnings for diagnostics
       console.warn('[build][zip] warning:', err)
     })
+    // Resolve on close to ensure FD is flushed and closed
+    output.once('close', done)
+    // Ensure close is emitted after finish on some fast runners
     output.once('finish', () => {
-      if (!settled) {
-        settled = true
-        resolve()
+      try {
+        if (typeof output.close === 'function') output.close()
+      } catch (_) {
+        // ignore
       }
     })

Listen for the close event on the output stream to ensure the file descriptor is
flushed and closed before proceeding. This avoids intermittent truncated ZIPs on
fast CI runners.

build.mjs [431-459]

 async function zipFolder(dir) {
   const zipPath = `${dir}.zip`
   await fs.ensureDir(path.dirname(zipPath))
   await new Promise((resolve, reject) => {
     const output = fs.createWriteStream(zipPath)
     const archive = archiver('zip', { zlib: { level: 9 } })
     let settled = false
-    const fail = (err) => {
+    const settle = (err) => {
       if (!settled) {
         settled = true
-        reject(err)
+        err ? reject(err) : resolve()
       }
     }
+    const fail = (err) => settle(err)
     output.once('error', fail)
     archive.once('error', fail)
     archive.on('warning', (err) => {
-      // Log non-fatal archive warnings for diagnostics
       console.warn('[build][zip] warning:', err)
     })
-    output.once('finish', () => {
-      if (!settled) {
-        settled = true
-        resolve()
-      }
-    })
+    // finish fires when all data is flushed to the OS, but close ensures fd closed
+    output.once('close', () => settle())
     archive.pipe(output)
     archive.directory(dir, false)
     archive.finalize()
   })
 }
Suggestion importance[1-10]: 8


Why: The suggestion correctly identifies a potential race condition where the promise could resolve before the file stream is fully closed, and proposes listening to the close event, which is a critical fix for ensuring ZIP file integrity.

Medium
Possible issue
Cap worker pool below CPU count

Cap worker count to at least 1 and at most CPU count minus one to avoid starving
the main thread or oversubscribing small machines. This prevents build hangs or
thrashing on low-core CI runners.

build.mjs [43-62]

 const rawWorkers = process.env.BUILD_THREAD_WORKERS
   ? parseInt(process.env.BUILD_THREAD_WORKERS, 10)
   : undefined
 let threadWorkers
+const maxUsable = Math.max(1, cpuCount - 1) // keep one core for main thread/OS
 if (Number.isInteger(rawWorkers) && rawWorkers > 0) {
-  const maxWorkers = Math.max(1, cpuCount)
-  if (rawWorkers > maxWorkers) {
+  if (rawWorkers > maxUsable) {
     console.warn(
-      `[build] BUILD_THREAD_WORKERS=${rawWorkers} exceeds CPU count (${cpuCount}); capping to ${maxWorkers}`,
+      `[build] BUILD_THREAD_WORKERS=${rawWorkers} exceeds usable cores (${maxUsable}); capping to ${maxUsable}`,
     )
   }
-  threadWorkers = Math.min(rawWorkers, maxWorkers)
+  threadWorkers = Math.min(rawWorkers, maxUsable)
 } else {
   if (process.env.BUILD_THREAD_WORKERS) {
     console.warn(
-      `[build] Invalid BUILD_THREAD_WORKERS="${process.env.BUILD_THREAD_WORKERS}", defaulting to CPU count (${cpuCount})`,
+      `[build] Invalid BUILD_THREAD_WORKERS="${process.env.BUILD_THREAD_WORKERS}", defaulting to ${maxUsable}`,
     )
   }
-  threadWorkers = cpuCount
+  threadWorkers = maxUsable
 }
Suggestion importance[1-10]: 7


Why: The suggestion correctly identifies a potential performance issue where using all CPU cores for workers can starve the main thread, and proposes a standard practice of reserving one core, which improves build stability.

Medium
✅ Suggestions up to commit 0fdeae8
Category | Suggestion | Impact
Incremental [*]
Fix CI cache ordering

Move the Webpack cache restore step before installing dependencies so that cache
hits are not invalidated by files created during npm ci, and add cache: 'npm' to
actions/setup-node to ensure package caching is reliable and keyed to the
lockfile and Node version.

.github/workflows/pre-release-build.yml [24-46]

+- uses: actions/checkout@v5
 - uses: actions/setup-node@v4
   with:
     node-version: 20
+    cache: 'npm'
+    cache-dependency-path: '**/package-lock.json'
 - name: Detect Node major version
   run: echo "NODE_MAJOR=$(node -p 'process.versions.node.split(".")[0]')" >> $GITHUB_ENV
-- name: Cache npm cache
-  uses: actions/cache@v4
-  with:
-    path: ~/.npm
-    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
-    restore-keys: |
-      ${{ runner.os }}-node-
-- run: npm ci
 - name: Cache Webpack filesystem cache
   uses: actions/cache@v4
   with:
     path: |
       .cache/webpack
       node_modules/.cache/webpack
     key: ${{ runner.os }}-node${{ env.NODE_MAJOR }}-webpack-${{ hashFiles('**/package-lock.json') }}
     restore-keys: |
       ${{ runner.os }}-node${{ env.NODE_MAJOR }}-webpack-
+- run: npm ci
 - run: npm run build
Suggestion importance[1-10]: 7


Why: The suggestion correctly identifies that using the built-in cache: 'npm' for the setup-node action is best practice and that the Webpack cache should be restored before npm ci, improving the CI workflow's reliability and correctness.

Medium
Avoid copy failures for missing assets
Suggestion Impact: While it didn’t implement the exact file-existence filter before copying, the commit addressed the same problem by adding ensureDevCssPlaceholders to create placeholder CSS and sourcemap files in development, preventing copy failures and 404s due to missing optional assets.

code diff:

+// In development, create placeholder CSS and sourcemap files to avoid 404 noise
+async function ensureDevCssPlaceholders(targetDir) {
+  if (isProduction) return
+  const cssFiles = [path.join(targetDir, 'popup.css'), path.join(targetDir, 'content-script.css')]
+  for (const cssPath of cssFiles) {
+    if (!(await fs.pathExists(cssPath))) {
+      await fs.outputFile(cssPath, '/* dev placeholder */\n')
+    }
+    const mapPath = `${cssPath}.map`
+    if (!(await fs.pathExists(mapPath))) {
+      await fs.outputFile(mapPath, '{"version":3,"sources":[],"mappings":"","names":[]}')
+    }
+  }
+}
+
 async function finishOutput(outputDirSuffix, sourceBuildDir = outdir) {
   const commonFiles = [
     { src: 'src/logo.png', dst: 'logo.png' },
@@ -520,22 +528,7 @@
     [...commonFiles, { src: 'src/manifest.json', dst: 'manifest.json' }],
     chromiumOutputDir,
   )
-  // In development, ensure placeholder CSS and CSS sourcemap files exist to avoid 404 noise
-  if (!isProduction) {
-    const chromiumCssPlaceholders = [
-      path.join(chromiumOutputDir, 'popup.css'),
-      path.join(chromiumOutputDir, 'content-script.css'),
-    ]
-    for (const p of chromiumCssPlaceholders) {
-      if (!(await fs.pathExists(p))) {
-        await fs.outputFile(p, '/* dev placeholder */\n')
-      }
-      const mapPath = `${p}.map`
-      if (!(await fs.pathExists(mapPath))) {
-        await fs.outputFile(mapPath, '{"version":3,"sources":[],"mappings":"","names":[]}')
-      }
-    }
-  }
+  await ensureDevCssPlaceholders(chromiumOutputDir)
   if (isProduction) await zipFolder(chromiumOutputDir)
 
   // firefox
@@ -544,22 +537,7 @@
     [...commonFiles, { src: 'src/manifest.v2.json', dst: 'manifest.json' }],
     firefoxOutputDir,
   )
-  // In development, ensure placeholder CSS and CSS sourcemap files exist to avoid 404 noise
-  if (!isProduction) {
-    const firefoxCssPlaceholders = [
-      path.join(firefoxOutputDir, 'popup.css'),
-      path.join(firefoxOutputDir, 'content-script.css'),
-    ]
-    for (const p of firefoxCssPlaceholders) {
-      if (!(await fs.pathExists(p))) {
-        await fs.outputFile(p, '/* dev placeholder */\n')
-      }
-      const mapPath = `${p}.map`
-      if (!(await fs.pathExists(mapPath))) {
-        await fs.outputFile(mapPath, '{"version":3,"sources":[],"mappings":"","names":[]}')
-      }
-    }
-  }
+  await ensureDevCssPlaceholders(firefoxOutputDir)
   if (isProduction) await zipFolder(firefoxOutputDir)
 }

Guard optional assets before copying by filtering commonFiles to entries that
exist to avoid hard failures when a chunk or map is not emitted (e.g.,
tree-shaken CSS or watch-first run), and log a warning instead of throwing for
missing dev maps.

build.mjs [468-523]

 async function finishOutput(outputDirSuffix, sourceBuildDir = outdir) {
-  const commonFiles = [
+  const maybeFiles = [
     { src: 'src/logo.png', dst: 'logo.png' },
     { src: 'src/rules.json', dst: 'rules.json' },
-
     { src: `${sourceBuildDir}/shared.js`, dst: 'shared.js' },
-    { src: `${sourceBuildDir}/content-script.css`, dst: 'content-script.css' }, // shared
-
+    { src: `${sourceBuildDir}/content-script.css`, dst: 'content-script.css' },
     { src: `${sourceBuildDir}/content-script.js`, dst: 'content-script.js' },
-
     { src: `${sourceBuildDir}/background.js`, dst: 'background.js' },
-
     { src: `${sourceBuildDir}/popup.js`, dst: 'popup.js' },
     { src: `${sourceBuildDir}/popup.css`, dst: 'popup.css' },
     { src: 'src/popup/index.html', dst: 'popup.html' },
-
     { src: `${sourceBuildDir}/IndependentPanel.js`, dst: 'IndependentPanel.js' },
     { src: 'src/pages/IndependentPanel/index.html', dst: 'IndependentPanel.html' },
     ...(isProduction
       ? []
       : [
           { src: `${sourceBuildDir}/shared.js.map`, dst: 'shared.js.map' },
           { src: `${sourceBuildDir}/content-script.js.map`, dst: 'content-script.js.map' },
           { src: `${sourceBuildDir}/background.js.map`, dst: 'background.js.map' },
           { src: `${sourceBuildDir}/popup.js.map`, dst: 'popup.js.map' },
           { src: `${sourceBuildDir}/IndependentPanel.js.map`, dst: 'IndependentPanel.js.map' },
         ]),
   ]
+  const commonFiles = []
+  for (const f of maybeFiles) {
+    if (await fs.pathExists(f.src)) {
+      commonFiles.push(f)
+    } else if (!isProduction) {
+      console.warn('[build] optional asset missing (dev):', f.src)
+    }
+  }
   ...
   await copyFiles(
     [...commonFiles, { src: 'src/manifest.json', dst: 'manifest.json' }],
     chromiumOutputDir,
   )
   ...
   await copyFiles(
     [...commonFiles, { src: 'src/manifest.v2.json', dst: 'manifest.json' }],
     firefoxOutputDir,
   )
+}
Suggestion importance[1-10]: 6


Why: The suggestion correctly points out that the PR's implementation in copyFiles would cause the build to fail if an optional development file like a source map is missing, and proposes a more robust solution by checking for file existence first.

Low
Possible issue
Fail clearly when Sass missing
Suggestion Impact: The commit added a nested try/catch: it first tries to import 'sass-embedded', then falls back to 'sass', and if both fail, it throws a clear error message indicating no Sass implementation is available.

code diff:

+  } catch (e1) {
+    try {
+      const mod = await import('sass')
+      sassImpl = mod.default || mod
+    } catch (e2) {
+      throw new Error(`No Sass implementation available. Install 'sass-embedded' or 'sass'.`)
+    }

Guard against absence of both Sass implementations to prevent runtime crashes.
Throw a clear error message if neither module can be resolved so failures are
actionable.

build.mjs [97-104]

-const sassImpl
+let sassImpl
 try {
   const mod = await import('sass-embedded')
   sassImpl = mod.default || mod
-} catch (e) {
-  const mod = await import('sass')
-  sassImpl = mod.default || mod
+} catch (_) {
+  try {
+    const mod = await import('sass')
+    sassImpl = mod.default || mod
+  } catch (err) {
+    throw new Error(
+      `No Sass implementation available. Install 'sass-embedded' or 'sass'. Original error: ${err && err.message}`,
+    )
+  }
 }
Suggestion importance[1-10]: 7


Why: This suggestion improves the build script's robustness by adding explicit error handling for when neither sass-embedded nor sass is available, providing a clear and actionable error message.

Medium
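The nested try/catch the commit landed generalizes to a small ordered loader. A minimal sketch under that assumption (`loadFirst` is a hypothetical helper; the real fallback order is 'sass-embedded' then 'sass'):

```javascript
// Sketch of a generic "try implementations in order" dynamic-import fallback,
// as in the sass-embedded -> sass chain above. loadFirst is hypothetical.
async function loadFirst(names) {
  const errors = []
  for (const name of names) {
    try {
      const mod = await import(name)
      return mod.default || mod
    } catch (err) {
      errors.push(`${name}: ${err.message}`)
    }
  }
  throw new Error(`No implementation available. Tried:\n${errors.join('\n')}`)
}

// usage sketch: const sassImpl = await loadFirst(['sass-embedded', 'sass'])
```

Collecting every import error, as here, keeps the final message actionable when neither module resolves.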
General
Leave one CPU core free

Cap the default worker count to CPU count minus one to avoid exhausting all
cores, which can cause system thrashing in CI and on laptops. Preserve explicit
overrides, but when not set, prefer leaving one core free.

build.mjs [43-62]

 const rawWorkers = process.env.BUILD_THREAD_WORKERS
   ? parseInt(process.env.BUILD_THREAD_WORKERS, 10)
   : undefined
 let threadWorkers
 if (Number.isInteger(rawWorkers) && rawWorkers > 0) {
   const maxWorkers = Math.max(1, cpuCount)
   if (rawWorkers > maxWorkers) {
     console.warn(
       `[build] BUILD_THREAD_WORKERS=${rawWorkers} exceeds CPU count (${cpuCount}); capping to ${maxWorkers}`,
     )
   }
   threadWorkers = Math.min(rawWorkers, maxWorkers)
 } else {
   if (process.env.BUILD_THREAD_WORKERS) {
     console.warn(
       `[build] Invalid BUILD_THREAD_WORKERS="${process.env.BUILD_THREAD_WORKERS}", defaulting to CPU count (${cpuCount})`,
     )
   }
-  threadWorkers = cpuCount
+  // leave one core free by default
+  threadWorkers = Math.max(1, cpuCount - 1)
 }
Suggestion importance[1-10]: 6


Why: The suggestion improves system stability by default by leaving one CPU core free for other tasks, which is a sensible performance optimization.

Low
✅ Suggestions up to commit d9cc0eb
Category | Suggestion | Impact
Incremental [*]
Add Node version to cache key

Ensure the cache key incorporates the Node major version so caches don't get
mixed across Node versions, which can cause inconsistent or corrupted caches.

.github/workflows/pre-release-build.yml [27-35]

 - name: Detect Node major version
   run: echo "NODE_MAJOR=$(node -p 'process.versions.node.split(".")[0]')" >> $GITHUB_ENV
 - name: Cache npm cache
   uses: actions/cache@v4
   with:
     path: ~/.npm
-    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
+    key: ${{ runner.os }}-node${{ env.NODE_MAJOR }}-${{ hashFiles('**/package-lock.json') }}
     restore-keys: |
-      ${{ runner.os }}-node-
+      ${{ runner.os }}-node${{ env.NODE_MAJOR }}-
Suggestion importance[1-10]: 7


Why: The suggestion correctly identifies that the npm cache key is missing the Node.js version, which could lead to cache corruption, and aligns it with the webpack cache key that already includes it.

Medium
Prevent null stats access

Guard against a null stats object before calling hasErrors to prevent runtime
crashes on initial watch errors.

build.mjs [393-410]

 const watching = compiler.watch({}, (err, stats) => {
-  const hasErrors = !!(
-    err ||
-    (stats && typeof stats.hasErrors === 'function' && stats.hasErrors())
-  )
+  const hasStats = !!stats && typeof stats.hasErrors === 'function'
+  const hasErrors = !!(err || (hasStats && stats.hasErrors()))
   const ret = callback(err, stats)
   if (process.env.BUILD_WATCH_ONCE) {
     const finalize = () =>
       watching.close((closeErr) => {
         if (closeErr) console.error('Error closing watcher:', closeErr)
-        // Exit explicitly to prevent hanging processes in CI
-        // Use non-zero exit code when errors occurred
         process.exit(hasErrors || closeErr ? 1 : 0)
       })
     if (ret && typeof ret.then === 'function') ret.then(finalize, finalize)
     else finalize()
   }
 })
Suggestion importance[1-10]: 6


Why: This is a valid defensive programming suggestion that prevents a potential runtime error by checking if the stats object is null before accessing its methods, improving the build script's robustness.

Low
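The finalize-whether-sync-or-async pattern that this and the later exit-code suggestions rely on can be isolated into a helper. A small sketch (`runThenFinalize` is an illustrative name, not code from the PR):

```javascript
// Sketch of the "finalize after the callback, whether it returned a promise
// or a plain value" pattern used by the watch-once and production paths above.
// runThenFinalize is an illustrative helper name, not code from the PR.
function runThenFinalize(callback, finalize) {
  let ret
  try {
    ret = callback()
  } catch (_) {
    finalize()
    return
  }
  // Promise-like return values defer cleanup; plain values clean up immediately
  if (ret && typeof ret.then === 'function') ret.then(finalize, finalize)
  else finalize()
}

runThenFinalize(() => 42, () => console.log('sync finalized'))
runThenFinalize(() => Promise.resolve(), () => console.log('async finalized'))
```

Passing `finalize` as both fulfillment and rejection handler is what keeps the watcher (or compiler) from staying open when the callback fails.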
Provide CSS sourcemap placeholders
Suggestion Impact: The commit added creation of .map files for CSS placeholders in both Chromium and Firefox development outputs, matching the suggestion to prevent 404s for CSS sourcemaps.

code diff:

-  // In development, ensure placeholder CSS files exist to avoid 404 noise
+  // In development, ensure placeholder CSS and CSS sourcemap files exist to avoid 404 noise
   if (!isProduction) {
     const chromiumCssPlaceholders = [
       path.join(chromiumOutputDir, 'popup.css'),
@@ -508,6 +530,10 @@
       if (!(await fs.pathExists(p))) {
         await fs.outputFile(p, '/* dev placeholder */\n')
       }
+      const mapPath = `${p}.map`
+      if (!(await fs.pathExists(mapPath))) {
+        await fs.outputFile(mapPath, '{"version":3,"sources":[],"mappings":"","names":[]}')
+      }
     }
   }
   if (isProduction) await zipFolder(chromiumOutputDir)
@@ -518,7 +544,7 @@
     [...commonFiles, { src: 'src/manifest.v2.json', dst: 'manifest.json' }],
     firefoxOutputDir,
   )
-  // In development, ensure placeholder CSS files exist to avoid 404 noise
+  // In development, ensure placeholder CSS and CSS sourcemap files exist to avoid 404 noise
   if (!isProduction) {
     const firefoxCssPlaceholders = [
       path.join(firefoxOutputDir, 'popup.css'),
@@ -527,6 +553,10 @@
     for (const p of firefoxCssPlaceholders) {
       if (!(await fs.pathExists(p))) {
         await fs.outputFile(p, '/* dev placeholder */\n')
+      }
+      const mapPath = `${p}.map`
+      if (!(await fs.pathExists(mapPath))) {
+        await fs.outputFile(mapPath, '{"version":3,"sources":[],"mappings":"","names":[]}')
       }

Since devtool produces external source maps, also copy or generate placeholder
.map files for CSS assets in development to avoid 404s when the browser requests
*.css.map.

build.mjs [496-506]

 devtool: isProduction ? false : 'cheap-module-source-map',
 ...
 await copyFiles(
   [...commonFiles, { src: 'src/manifest.json', dst: 'manifest.json' }],
   chromiumOutputDir,
 )
-// In development, ensure placeholder CSS files exist to avoid 404 noise
+// In development, ensure placeholder CSS and CSS sourcemap files exist to avoid 404 noise
 if (!isProduction) {
   const chromiumCssPlaceholders = [
     path.join(chromiumOutputDir, 'popup.css'),
     path.join(chromiumOutputDir, 'content-script.css'),
   ]
   for (const p of chromiumCssPlaceholders) {
     if (!(await fs.pathExists(p))) {
       await fs.outputFile(p, '/* dev placeholder */\n')
     }
+    const mapPath = `${p}.map`
+    if (!(await fs.pathExists(mapPath))) {
+      await fs.outputFile(mapPath, '{"version":3,"sources":[],"mappings":"","names":[]}')
+    }
   }
 }
Suggestion importance[1-10]: 5


Why: The suggestion correctly points out that missing CSS sourcemaps will cause 404 errors in development, and proposes a valid fix that complements the existing placeholder logic for CSS files.

Low
Possible issue
Prevent unsafe thread-loader usage

Avoid using thread-loader together with cacheDirectory-enabled babel-loader, as
it often causes hangs or degraded performance with small/fast transforms. Gate
threading to production only, and disable threads when BUILD_WATCH_ONCE or
!isProduction. This prevents CI/dev flakiness and reduces startup overhead.

build.mjs [69-261]

-const enableThread = process.env.BUILD_THREAD === '0' ? false : true
+const enableThread =
+  process.env.BUILD_THREAD === '0'
+    ? false
+    : isProduction && !isWatchOnce // only thread in sustained prod builds
 ...
 {
   test: /\.m?jsx?$/,
   exclude: /(node_modules|bower_components)/,
   resolve: {
     fullySpecified: false,
   },
   use: [
     ...(enableThread
       ? [
           {
             loader: 'thread-loader',
             options: {
               workers: threadWorkers,
-              // Ensure one-off dev build exits quickly
-              poolTimeout: isProduction
-                ? PRODUCTION_POOL_TIMEOUT_MS
-                : isWatchOnce
-                ? 0
-                : Infinity,
+              poolTimeout: PRODUCTION_POOL_TIMEOUT_MS,
             },
           },
         ]
       : []),
     {
       loader: 'babel-loader',
       options: {
         cacheDirectory: true,
         cacheCompression: false,
         presets: ['@babel/preset-env'],
         plugins: [
           ['@babel/plugin-transform-runtime'],
           [
             '@babel/plugin-transform-react-jsx',
             {
               runtime: 'automatic',
               importSource: 'preact',
             },
           ],
         ],
       },
     },
   ],
 },
Suggestion importance[1-10]: 7


Why: The suggestion correctly identifies a potential stability issue with using thread-loader and babel-loader's cacheDirectory together, which can cause hangs. Restricting thread-loader to production builds is a good practice that improves development stability and startup time.

Medium
✅ Suggestions up to commit 7bad746
Category | Suggestion | Impact
Possible issue
Exit non-zero on prod failures
Suggestion Impact: The commit adjusted production build error handling to affect exit codes: it sets process.exitCode=1 when closing the compiler fails (lines 49-58) and ensures the top-level build() promise rejection exits with code 1 (lines 82-86). While it doesn’t exactly mirror the suggested finalize/hasErrors logic within compiler.run, it moves toward propagating non-zero exits on failures.

code diff:

@@ -377,7 +381,10 @@
     compiler.run((err, stats) => {
       const finishClose = () =>
         compiler.close((closeErr) => {
-          if (closeErr) console.error('Error closing compiler:', closeErr)
+          if (closeErr) {
+            console.error('Error closing compiler:', closeErr)
+            process.exitCode = 1
+          }
         })

In production, errors are logged but the process may still exit with code 0.
Propagate non-zero exit codes on build failure to avoid green CI with broken
artifacts. Mirror the dev watch-once behavior by exiting based on errors.

build.mjs [374-411]

 if (isProduction) {
-  // Ensure compiler is properly closed after production runs
   compiler.run((err, stats) => {
-    const finishClose = () =>
+    const hasErrors =
+      !!err || (stats && typeof stats.hasErrors === 'function' && stats.hasErrors())
+    const finishClose = (cb) =>
       compiler.close((closeErr) => {
         if (closeErr) console.error('Error closing compiler:', closeErr)
+        if (cb) cb(closeErr)
       })
     try {
       const ret = callback(err, stats)
+      const finalize = () =>
+        finishClose(() => {
+          if (hasErrors) process.exit(1)
+        })
       if (ret && typeof ret.then === 'function') {
-        ret.then(finishClose, finishClose)
+        ret.then(finalize, finalize)
       } else {
-        finishClose()
+        finalize()
       }
     } catch (_) {
-      finishClose()
+      finishClose(() => process.exit(1))
     }
   })
 } else {
   const watching = compiler.watch({}, (err, stats) => {
     const hasErrors = !!(
       err ||
       (stats && typeof stats.hasErrors === 'function' && stats.hasErrors())
     )
     const ret = callback(err, stats)
     if (process.env.BUILD_WATCH_ONCE) {
       const finalize = () =>
         watching.close((closeErr) => {
           if (closeErr) console.error('Error closing watcher:', closeErr)
-          // Exit explicitly to prevent hanging processes in CI
-          // Use non-zero exit code when errors occurred
           process.exit(hasErrors || closeErr ? 1 : 0)
         })
       if (ret && typeof ret.then === 'function') ret.then(finalize, finalize)
       else finalize()
     }
   })
 }
Suggestion importance[1-10]: 8


Why: This is a critical suggestion that correctly points out that production build failures would not propagate a non-zero exit code, potentially causing CI to pass with broken artifacts. The proposed change correctly mirrors the error handling logic from the development build, which is a significant improvement for CI reliability.

Medium
Add robust Sass fallback error
Suggestion Impact: The commit updated the Sass import logic to try 'sass-embedded' first, fall back to 'sass', and throw a clear error if both fail. This implements the suggested robust fallback error, though with a simpler error message and slightly different handling of the 'sass' import default.

code diff:

-  // Use the default export from sass-embedded; sass-loader expects the implementation object.
-  // Fallback to `sass` if sass-embedded is unavailable.
   let sassImpl
   try {
     const mod = await import('sass-embedded')
     sassImpl = mod.default || mod
-  } catch (e) {
-    const mod = await import('sass')
-    // Prefer namespace import style to avoid deprecated default import warning
-    // "import sass from 'sass'" is deprecated; use namespace instead
-    sassImpl = mod
+  } catch (e1) {
+    try {
+      const mod = await import('sass')
+      sassImpl = mod.default || mod
+    } catch (e2) {
+      throw new Error(`No Sass implementation available. Install 'sass-embedded' or 'sass'.`)
+    }
   }

If both sass-embedded and sass are unavailable, the build will crash with an
unhelpful dynamic import error. Add a clear fallback error that instructs how to
install one of the implementations. This prevents opaque failures in CI and
local envs.

build.mjs [93-102]

-const sassImpl
+let sassImpl
 try {
   const mod = await import('sass-embedded')
   sassImpl = mod.default || mod
-} catch (e) {
-  const mod = await import('sass')
-  // Prefer namespace import style to avoid deprecated default import warning
-  // "import sass from 'sass'" is deprecated; use namespace instead
-  sassImpl = mod
+} catch (e1) {
+  try {
+    const mod = await import('sass')
+    sassImpl = mod
+  } catch (e2) {
+    throw new Error(
+      '[build] No Sass implementation found. Install "sass-embedded" (preferred) or "sass". Original errors: ' +
+        (e1 && e1.message ? e1.message : String(e1)) +
+        ' | ' +
+        (e2 && e2.message ? e2.message : String(e2)),
+    )
+  }
 }
Suggestion importance[1-10]: 6

__

Why: This suggestion correctly identifies a potential unhandled error if both sass-embedded and sass imports fail, and proposes a robust solution to provide a clearer error message, which improves the build script's reliability.

Low
General
Add build config to cache key

The cache key ignores webpack config changes, risking stale caches and wrong
builds. Include a hash of build.mjs to bust cache when build settings change.
This prevents subtle CI-only build errors.

.github/workflows/pre-release-build.yml [37-45]

 - name: Cache Webpack filesystem cache
   uses: actions/cache@v4
   with:
     path: |
       .cache/webpack
       node_modules/.cache/webpack
-    key: ${{ runner.os }}-node${{ env.NODE_MAJOR }}-webpack-${{ hashFiles('**/package-lock.json') }}
+    key: ${{ runner.os }}-node${{ env.NODE_MAJOR }}-webpack-${{ hashFiles('**/package-lock.json', 'build.mjs') }}
     restore-keys: |
       ${{ runner.os }}-node${{ env.NODE_MAJOR }}-webpack-
Suggestion importance[1-10]: 7


Why: The suggestion correctly identifies that the Webpack cache key in the CI workflow is incomplete because it doesn't account for changes in build.mjs. Adding the file hash to the key is a valid and important improvement to prevent using a stale cache.

Medium
Incremental [*]
Avoid watcher edge-case crash
Suggestion Impact: The commit changed the watch-once condition to use isWatchOnce instead of reading the env var directly, aligning with the suggestion. However, it did not modify the hasErrors calculation. Impact is partial.

code diff:

         (stats && typeof stats.hasErrors === 'function' && stats.hasErrors())
       )
       const ret = callback(err, stats)
-      if (process.env.BUILD_WATCH_ONCE) {
+      if (isWatchOnce) {
         const finalize = (callbackFailed = false) =>
           watching.close((closeErr) => {
             if (closeErr) console.error('Error closing watcher:', closeErr)

Use the already computed isWatchOnce flag consistently and short‑circuit when
stats can be undefined to avoid TypeError on stats.hasErrors() in rare watcher
edge cases.

build.mjs [393-410]

 const watching = compiler.watch({}, (err, stats) => {
-  const hasErrors = !!(
-    err ||
-    (stats && typeof stats.hasErrors === 'function' && stats.hasErrors())
-  )
+  const hasErrors =
+    !!err ||
+    !!(stats && typeof stats.hasErrors === 'function' && stats.hasErrors())
   const ret = callback(err, stats)
-  if (process.env.BUILD_WATCH_ONCE) {
+  if (isWatchOnce) {
     const finalize = () =>
       watching.close((closeErr) => {
         if (closeErr) console.error('Error closing watcher:', closeErr)
-        // Exit explicitly to prevent hanging processes in CI
-        // Use non-zero exit code when errors occurred
         process.exit(hasErrors || closeErr ? 1 : 0)
       })
     if (ret && typeof ret.then === 'function') ret.then(finalize, finalize)
     else finalize()
   }
 })
Suggestion importance[1-10]: 6


Why: The suggestion improves code consistency by using the isWatchOnce variable and slightly enhances robustness by guarding against a potential TypeError if stats is null, making the watch-once logic cleaner and safer.

Low
Possible issue
Prevent truncated ZIP archives
Suggestion Impact: The commit changed the stream listener from 'close' to 'finish' and simplified finalize handling, aligning with the suggestion to resolve only after data is flushed and to call archive.finalize().

code diff:

     const onError = (err) => reject(err)
     output.on('error', onError)
     archive.on('error', onError)
-    output.on('close', resolve)
+    output.on('finish', resolve)
     archive.pipe(output)
     archive.directory(dir, false)
-    try {
-      archive.finalize()
-    } catch (err) {
-      onError(err)
-    }
+    archive.finalize()
   })

Ensure the promise resolves only after the archive stream finishes writing.
Listening to close on the output may miss buffered writes; use
output.on('finish', ...) and also wait for archive.finalize() resolution. This
prevents truncated ZIPs in CI under heavy I/O.

build.mjs [414-428]

 async function zipFolder(dir) {
   const zipPath = `${dir}.zip`
   await fs.ensureDir(path.dirname(zipPath))
   await new Promise((resolve, reject) => {
     const output = fs.createWriteStream(zipPath)
     const archive = archiver('zip', { zlib: { level: 9 } })
     const onError = (err) => reject(err)
     output.on('error', onError)
     archive.on('error', onError)
-    output.on('close', resolve)
+    // 'finish' indicates all data has been flushed to the underlying system
+    output.on('finish', resolve)
     archive.pipe(output)
     archive.directory(dir, false)
     archive.finalize().catch(onError)
   })
 }
Suggestion importance[1-10]: 8


Why: This is a critical bug fix; using the close event on a write stream can lead to truncated files, and the suggestion correctly proposes using the finish event to ensure all data is written before resolving the promise.

qodo-code-review[bot] avatar Aug 30 '25 20:08 qodo-code-review[bot]

@codex review

PeterDaveHello avatar Aug 30 '25 20:08 PeterDaveHello

@coderabbitai review

PeterDaveHello avatar Aug 31 '25 07:08 PeterDaveHello

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

coderabbitai[bot] avatar Aug 31 '25 07:08 coderabbitai[bot]

@codex review

PeterDaveHello avatar Aug 31 '25 07:08 PeterDaveHello

@codex review

PeterDaveHello avatar Aug 31 '25 07:08 PeterDaveHello

@codex review

PeterDaveHello avatar Aug 31 '25 11:08 PeterDaveHello

@codex review @coderabbitai review

PeterDaveHello avatar Aug 31 '25 12:08 PeterDaveHello

@PeterDaveHello I'll trigger a review of the build pipeline refactor changes in PR #886. This comprehensive update introduces esbuild-based optimization, thread-loader parallelization, and various performance enhancements.

✅ Actions performed

Review triggered.

coderabbitai[bot] avatar Aug 31 '25 12:08 coderabbitai[bot]

@codex review

PeterDaveHello avatar Aug 31 '25 13:08 PeterDaveHello

Codex Review: Didn't find any major issues. Keep them coming!

About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you open a pull request for review, mark a draft as ready, or comment "@codex review". If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex fix this CI failure" or "@codex address that feedback".

@codex review

PeterDaveHello avatar Aug 31 '25 16:08 PeterDaveHello

@codex review

PeterDaveHello avatar Aug 31 '25 16:08 PeterDaveHello

@codex review

PeterDaveHello avatar Sep 01 '25 17:09 PeterDaveHello

Codex Review: Didn't find any major issues. Keep it up!

@codex review

PeterDaveHello avatar Sep 02 '25 05:09 PeterDaveHello

Codex Review: Didn't find any major issues. Chef's kiss.

@codex review

PeterDaveHello avatar Sep 06 '25 17:09 PeterDaveHello

Codex Review: Didn't find any major issues. Chef's kiss.

/gemini review

PeterDaveHello avatar Sep 06 '25 18:09 PeterDaveHello

@codex review

PeterDaveHello avatar Sep 06 '25 19:09 PeterDaveHello

/gemini review

PeterDaveHello avatar Sep 06 '25 19:09 PeterDaveHello

Codex Review: Didn't find any major issues. Nice work!

/gemini review @codex review

PeterDaveHello avatar Sep 07 '25 06:09 PeterDaveHello

Hello @PeterDaveHello, I noticed you've invoked /gemini review. As an AI assistant, I'm designed to provide detailed feedback and suggestions within comments on specific code changes or answer questions you might have about the pull request. I don't execute commands like /gemini review to initiate a full, new review cycle.

However, I've already reviewed this pull request, and I can offer further insights or address any specific questions you have regarding the build pipeline refactor, esbuild integration, threading, or caching improvements. Feel free to ask!

gemini-code-assist[bot] avatar Sep 07 '25 06:09 gemini-code-assist[bot]

/gemini review

PeterDaveHello avatar Sep 07 '25 06:09 PeterDaveHello

/gemini review

PeterDaveHello avatar Sep 07 '25 10:09 PeterDaveHello