crawl4ai
fix: prevent memory leak by closing unused context
Summary
When scraping many URLs continuously, browser contexts accumulate in memory and are never cleaned up. The existing cleanup mechanism only runs when browsers go idle, which never happens under continuous load. This causes memory to grow unbounded until the process crashes or becomes unresponsive.
Fixes #943
Small note: I'm not that used to Python. I won't lie, Claude helped me a bit here, but I've reviewed what it did and tested it, so this isn't just yet more AI slop :)
List of files changed and why
- `browser_manager.py`: Add `_context_refcounts` tracking, plus `cleanup_contexts()` and `release_context()` methods
- `async_crawler_strategy.py`: Release the context ref in a `finally` block after each crawl
- `deploy/docker/api.py`: Trigger context cleanup after each request
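To make the mechanism concrete, here is a minimal sketch of the refcount-based cleanup. The method and attribute names (`_context_refcounts`, `release_context()`, `cleanup_contexts()`) come from the file list above; the bodies, the `get_context` helper, and the key-based context store are illustrative assumptions, not the actual crawl4ai implementation.

```python
import asyncio
from collections import defaultdict


class BrowserManager:
    """Sketch: contexts stay alive while crawls reference them and are
    reaped by cleanup_contexts() once their refcount drops to zero."""

    def __init__(self):
        self._contexts = {}                        # config key -> context (stand-in object here)
        self._context_refcounts = defaultdict(int)  # config key -> in-flight crawl count
        self._lock = asyncio.Lock()

    async def get_context(self, key):
        # Hand out a context and bump its refcount atomically.
        async with self._lock:
            ctx = self._contexts.setdefault(key, object())  # real code would create a browser context
            self._context_refcounts[key] += 1
            return ctx

    async def release_context(self, key):
        # Called from the crawler's finally block so the count drops even on errors.
        async with self._lock:
            self._context_refcounts[key] -= 1

    async def cleanup_contexts(self):
        # Close and forget every context no in-flight crawl still references.
        async with self._lock:
            for key in [k for k, n in self._context_refcounts.items() if n <= 0]:
                self._contexts.pop(key, None)       # real code would await context.close()
                self._context_refcounts.pop(key, None)
```

The point of the `finally`-block release is that a crashed crawl can never strand a positive refcount, and triggering `cleanup_contexts()` after each request (rather than on browser idle) is what makes the reaping actually fire under continuous load.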
How Has This Been Tested?
This has been tested locally by running the following script and comparing before/after memory usage between the master version and the patched version, both deployed via docker compose.
The script simply performs 100 scrapes at a concurrency of 8 and reports the status code distribution: https://gist.github.com/Martichou/27555055d130d1c65f6a8457fbeb2a22
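The linked gist is the authoritative test script; as a rough stand-in for the same idea (N requests at fixed concurrency, tallied by status code), it might look like the sketch below. The `scrape` function is a placeholder for a real request against the crawl4ai server.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor


def scrape(url):
    # Placeholder: a real version would POST the URL to the crawl4ai
    # /crawl endpoint and return the HTTP status code of the response.
    return 200


def run_load_test(urls, concurrency=8):
    # Fire all requests through a fixed-size worker pool, so at most
    # `concurrency` scrapes are in flight at once, then tally statuses.
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        statuses = list(pool.map(scrape, urls))
    return Counter(statuses)
```

Running three such batches back to back while watching container memory (e.g. via `docker stats`) is what produced the numbers below.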
Result of the test:
Unpatched version:
Baseline memory usage: 4.5%
End of first test run using unpatched version: 23.4%
End of second test run using unpatched version: 27.6%
End of third test run using unpatched version: 32.8%
Patched version:
Baseline memory usage: 5.7%
End of first test run using patched version: 11.2%
End of second test run using patched version: 12.3%
End of third test run using patched version: 13.4%
It may not eliminate every leak (there is an unexplained ~1% growth between runs), but closing the browser via the kill-browser endpoint brings memory back down to 10%.
Checklist:
- [x] My code follows the style guidelines of this project
- [x] I have performed a self-review of my own code
- [x] I have commented my code, particularly in hard-to-understand areas
- [ ] I have made corresponding changes to the documentation
- [ ] I have added/updated unit tests that prove my fix is effective or that my feature works
- [ ] New and existing unit tests pass locally with my changes