cf-clearance-scraper
High disk usage after running for a long time
First of all, thanks for this program, it works very well. But after running for a long time it consumes a lot of disk space, and I'm at a loss. Right now I can only periodically delete and reinstall it, which is too much hassle and also disrupts operation. I hope you can release an update to fix this issue, thank you!
I have also encountered this issue, which appears to be related to the puppeteer_dev_profile directories.
/var/lib/docker/overlay2/c3e07c5e8c8b3e2823917989e7bcb238e95464a671cfc4818260c8504fdb8749/merged/tmp
At present, I am unable to dedicate sufficient time to find an optimal solution. However, I have devised a workaround by implementing an automated agent that restarts the docker container on an hourly basis.
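For reference, the hourly-restart workaround described above can be done with a single crontab entry. The container name below is hypothetical; substitute whatever name your deployment uses.

```shell
# Crontab entry: restart the scraper container at the top of every hour.
# "cf-clearance-scraper" is an assumed container name; check `docker ps`.
0 * * * * docker restart cf-clearance-scraper
```

Restarting the container discards its writable layer contents only if the profiles live in a tmpfs or are cleaned on startup; otherwise pair this with a periodic `docker system prune` or a volume cleanup step.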
You can adjust this line: https://github.com/zfcsoftware/cf-clearance-scraper/blob/main/module/browser.js#L82 to set the profile folder per request, e.g. `--user-data-dir=user_data_dirs/{session_id}`, so that each folder/request is unique. Then add a handler to delete the folder after the disconnect: https://github.com/zfcsoftware/cf-clearance-scraper/blob/main/module/browser.js#L103. This way you will only have a couple of profiles on disk at any given time. If you have a LOT of requests, you can group several folders and delete them in chunks of 10 or so to reduce the IO.
Which directory holds the cached files? Could you share that?
The library has been rewritten. Please test the latest version and examples by following the readme file. If you are still getting errors or this issue persists, please open a new issue. https://github.com/zfcsoftware/cf-clearance-scraper/releases/tag/2.0.3