feat(debugger): implement intelligent snapshot pruning for oversized payloads
What does this PR do?
Implements an intelligent snapshot pruning algorithm for Dynamic Instrumentation/Live Debugger that selectively removes the largest and deepest leaf nodes from oversized payloads while preserving the overall schema structure. This replaces the previous approach which simply deleted all captured variables when a snapshot exceeded the 1MB size limit.
Motivation
Align with how the other tracers prune large snapshots.
Previously, when a snapshot payload exceeded 1MB, we would delete all captures entirely and show an error message to users. This was a poor user experience because users would lose all captured variable data, even though most of the snapshot was likely still valuable.
With this new pruning algorithm, we can intelligently reduce the size of oversized snapshots by removing only the least valuable data (the deepest and largest nodes, and nodes that were already truncated due to depth limits), allowing users to still see most of their captured variables even when dealing with large data structures.
Additional Notes
The pruning algorithm:
- Parses snapshots into a tree structure tracking JSON object positions
- Uses a priority queue to select nodes for pruning based on:
  - Presence of `notCapturedReason: 'depth'` (highest priority - already truncated data)
  - Depth level (deeper nodes pruned first)
  - Presence of any `notCapturedReason` (any truncated data)
  - Size (larger nodes pruned first)
- Only prunes nodes at level 6 or deeper (`locals` and below)
- Promotes parent nodes when all children are pruned to reduce overhead
- Iteratively prunes if needed to reach target size
- Falls back to pruning everything if pruning fails or errors out
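As a rough illustration, the priority ordering above can be sketched as a comparator over candidate leaf nodes plus an iterative prune loop. This is a hypothetical sketch, not the actual dd-trace-js implementation; node shape (`depth`, `size`, `notCapturedReason`) and the helper names are illustrative.

```javascript
// Compare two candidate leaf nodes; the node sorted first is pruned first.
// Priority (per the list above): depth-truncated nodes, then deeper nodes,
// then any truncated nodes, then larger nodes. Illustrative sketch only.
function pruneOrder (a, b) {
  // 1. Nodes already truncated due to depth limits go first
  const aDepthCut = a.notCapturedReason === 'depth'
  const bDepthCut = b.notCapturedReason === 'depth'
  if (aDepthCut !== bDepthCut) return aDepthCut ? -1 : 1
  // 2. Deeper nodes first
  if (a.depth !== b.depth) return b.depth - a.depth
  // 3. Nodes with any notCapturedReason first
  const aCut = Boolean(a.notCapturedReason)
  const bCut = Boolean(b.notCapturedReason)
  if (aCut !== bCut) return aCut ? -1 : 1
  // 4. Larger nodes first
  return b.size - a.size
}

// Remove the highest-priority candidates until the payload fits maxSize.
function prune (leaves, totalSize, maxSize) {
  // Never prune above level 6 (`locals` and below)
  const candidates = leaves.filter((n) => n.depth >= 6).sort(pruneOrder)
  const pruned = []
  for (const node of candidates) {
    if (totalSize <= maxSize) break
    totalSize -= node.size
    pruned.push(node)
  }
  return { pruned, totalSize }
}
```

The real algorithm additionally tracks JSON byte positions so pruned subtrees can be spliced out of the serialized snapshot, and promotes parents whose children are all pruned; the sketch only shows the selection order.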
Overall package size
Self size: 4.35 MB
Deduped: 5.23 MB
No deduping: 5.23 MB
Dependency sizes
| name | version | self size | total size |
|------|---------|-----------|------------|
| import-in-the-middle | 1.15.0 | 127.66 kB | 856.24 kB |
| dc-polyfill | 0.1.10 | 26.73 kB | 26.73 kB |

π€ This report was automatically generated by heaviest-objects-in-the-universe
Graphite stack: #7006 (this PR) → master. This stack of pull requests is managed by Graphite.
Codecov Report
:white_check_mark: All modified and coverable lines are covered by tests.
:white_check_mark: Project coverage is 84.76%. Comparing base (86b4f71) to head (2a18882).
:warning: Report is 2 commits behind head on master.
Additional details and impacted files
```diff
@@ Coverage Diff @@
##           master    #7006   +/-  ##
==========================================
- Coverage   84.77%   84.76%   -0.01%
==========================================
  Files         521      521
  Lines       22149    22151      +2
==========================================
  Hits        18776    18776
- Misses       3373     3375      +2
```
:umbrella: View full report in Codecov by Sentry.
Benchmarks
Benchmark execution time: 2025-12-17 12:38:44
Comparing candidate commit 2a18882e7ba2ce03d0ae050117c076e6f49986f4 in PR branch watson/DEBUG-2624/implement-pruning-algo with baseline commit 86b4f716ce3f5fcf36d8e0a6e79a2104c80c55a5 in branch master.
Found 0 performance improvements and 0 performance regressions! Performance is the same for 291 metrics, 29 unstable metrics.
β Tests
π All green!
βοΈ No new flaky tests detected
π§ͺ All tests passed
π Commit SHA: 2a18882