Perspectives-Server
Speed up scan times
Automatically blacklisting sites that fail repeated scans may have security implications. But perhaps there are other things we could do to speed up scan times?
- Use a ThreadPool instead of continually spawning threads? How large should the pool be? (See the thread-pool sketch after this list.)
- Only scan one or a few subdomains instead of all subdomains for a site? For example, if we scan deviantart.com do we really need to scan every artist.deviantart.com subdomain? Or scan a random sampling each time? (See the sampling sketch below.)
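A minimal sketch of the thread-pool idea, not the notary's actual scan code: the `scan_service` function, the service list, and `POOL_SIZE` are placeholder assumptions. Since scans are network-bound rather than CPU-bound, a pool a few times larger than the CPU count is probably a reasonable starting point, but the right size would need measuring.

```python
# Sketch: scan services with a bounded thread pool instead of one thread per scan.
from concurrent.futures import ThreadPoolExecutor, as_completed
import ssl


def scan_service(service):
    """Fetch the server certificate for 'host:port' and return it as PEM text."""
    host, _, port = service.partition(":")
    cert = ssl.get_server_certificate((host, int(port or 443)))
    return service, cert


POOL_SIZE = 16  # hypothetical; tune based on measured throughput
services = ["www.example.com:443", "deviantart.com:443"]  # placeholder data

with ThreadPoolExecutor(max_workers=POOL_SIZE) as pool:
    futures = {pool.submit(scan_service, s): s for s in services}
    for future in as_completed(futures):
        service = futures[future]
        try:
            _, cert = future.result()
            print("scanned", service, "-", len(cert), "bytes of PEM")
        except Exception as exc:
            # One failed scan shouldn't take down the whole batch.
            print("failed", service, "-", exc)
```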
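And a rough sketch of random subdomain sampling. The grouping key here is just the last two DNS labels, which is wrong for suffixes like .co.uk (a real version might use the public suffix list), and `SAMPLE_SIZE` plus the host list are illustrative assumptions.

```python
# Sketch: always scan the parent domain, but only a random sample of its subdomains.
import random
from collections import defaultdict

SAMPLE_SIZE = 3  # max subdomains scanned per parent domain on each pass


def parent_domain(host):
    """Crude registered-domain guess: keep the last two DNS labels."""
    labels = host.split(".")
    return ".".join(labels[-2:]) if len(labels) > 2 else host


def sample_hosts(hosts, k=SAMPLE_SIZE):
    """Group hosts by parent domain; keep the parent plus up to k random subdomains."""
    groups = defaultdict(list)
    for host in hosts:
        groups[parent_domain(host)].append(host)
    chosen = []
    for parent, members in groups.items():
        if parent in members:
            chosen.append(parent)
        subdomains = [h for h in members if h != parent]
        chosen.extend(random.sample(subdomains, min(k, len(subdomains))))
    return chosen


hosts = ["deviantart.com", "alice.deviantart.com", "bob.deviantart.com",
         "carol.deviantart.com", "dave.deviantart.com", "example.org"]
print(sample_hosts(hosts))
```

Sampling a different subset on each pass would still give repeat offenders a chance of being noticed over time, while cutting the per-pass scan count for sites with many subdomains.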