purple-hats
BasicCrawler limit
Is there a way to get around this?
INFO: BasicCrawler: Crawler reached the maxRequestsPerCrawl limit of 100 requests and will shut down soon. Requests that are in progress will be allowed to finish.
INFO: BasicCrawler: Earlier, the crawler reached the maxRequestsPerCrawl limit of 100 requests and all requests that were in progress at that time have now finished. In total, the crawler processed 104 requests and will shut down.
I think this is just a default value set in constants/constants.js,
which can be adjusted here: exports.maxRequestsPerCrawl = 100;
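To illustrate how a cap like this typically works, here is a minimal, self-contained sketch (not the actual purple-hats source): a constants object stands in for constants/constants.js, and a crawl loop stops scheduling new requests once the cap is reached. The `crawl` function and the URL list are hypothetical, for demonstration only.

```javascript
// Stand-in for require('./constants/constants') in the real project,
// where the cap is exported as: exports.maxRequestsPerCrawl = 100;
const constants = { maxRequestsPerCrawl: 100 };

// Hypothetical crawl loop: process queued URLs until the cap is hit,
// then stop scheduling any further requests.
function crawl(urls, maxRequests = constants.maxRequestsPerCrawl) {
  const processed = [];
  for (const url of urls) {
    if (processed.length >= maxRequests) break; // cap reached: schedule no more
    processed.push(url); // placeholder for the real fetch/scan of the page
  }
  return processed;
}

const pages = Array.from({ length: 250 }, (_, i) => `https://example.com/page/${i}`);
console.log(crawl(pages).length);      // capped at the default of 100
console.log(crawl(pages, 200).length); // capped at a raised limit of 200
```

Editing the exported constant raises the ceiling globally, which is why exposing it as a CLI flag (as below) is the friendlier option.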
Specifying the following flag when running cli.js allows the user to set the maximum number of pages scanned:
-p, --maxpages    Maximum number of pages to scan (default: 100). Only available in website and sitemap scans
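An invocation might then look like the following. This is an illustrative sketch only: the -p flag comes from the help text above, but the URL value and any other flags your scan needs are assumptions; check `node cli.js --help` for the actual options.

```shell
# Raise the page cap to 500 for a scan of an example site (hypothetical URL)
node cli.js -p 500 -u https://example.com
```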