colly
Use LRU cache for robots.txt map?
If colly is crawling a lot of different websites, the robots.txt map can grow quite large, since it keeps an entry per host for the lifetime of the collector. How would you feel about a PR that replaces the robots.txt map with an LRU cache (https://github.com/hashicorp/golang-lru) so memory use stays bounded?
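To illustrate the idea, here is a minimal sketch of an LRU-bounded robots.txt cache. It uses a hand-rolled stdlib LRU (`container/list` plus a map) instead of hashicorp/golang-lru so the example is self-contained, and the type and field names (`robotsLRU`, `robotsEntry`) are hypothetical, not colly's actual internals; colly would store parsed robots.txt rules rather than the raw strings used here.

```go
package main

import (
	"container/list"
	"fmt"
)

// robotsEntry pairs a host with its robots.txt data. A plain string
// stands in for the parsed rules to keep the sketch self-contained.
type robotsEntry struct {
	host string
	data string
}

// robotsLRU is a fixed-capacity LRU cache keyed by host. The least
// recently used host is evicted when the cache is full.
type robotsLRU struct {
	capacity int
	order    *list.List               // most recently used at the front
	entries  map[string]*list.Element // host -> element in order
}

func newRobotsLRU(capacity int) *robotsLRU {
	return &robotsLRU{
		capacity: capacity,
		order:    list.New(),
		entries:  make(map[string]*list.Element),
	}
}

// Get returns the cached robots.txt data for host and marks the
// entry as recently used.
func (c *robotsLRU) Get(host string) (string, bool) {
	el, ok := c.entries[host]
	if !ok {
		return "", false
	}
	c.order.MoveToFront(el)
	return el.Value.(*robotsEntry).data, true
}

// Add caches robots.txt data for host, evicting the least recently
// used host if the cache is at capacity.
func (c *robotsLRU) Add(host, data string) {
	if el, ok := c.entries[host]; ok {
		c.order.MoveToFront(el)
		el.Value.(*robotsEntry).data = data
		return
	}
	if c.order.Len() >= c.capacity {
		oldest := c.order.Back()
		c.order.Remove(oldest)
		delete(c.entries, oldest.Value.(*robotsEntry).host)
	}
	c.entries[host] = c.order.PushFront(&robotsEntry{host: host, data: data})
}

func main() {
	cache := newRobotsLRU(2)
	cache.Add("example.com", "User-agent: *\nDisallow: /private")
	cache.Add("example.org", "User-agent: *\nAllow: /")
	cache.Get("example.com") // touch example.com so it is most recent
	cache.Add("example.net", "User-agent: *\nDisallow:")
	_, ok := cache.Get("example.org") // evicted as least recently used
	fmt.Println(ok)
}
```

With hashicorp/golang-lru the hand-rolled structure collapses to `lru.New(size)` plus `Add`/`Get`, and the library version is also safe for concurrent use, which matters since multiple collector goroutines may consult the map at once.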