
Use HTTP headers to control host rule expiration

Open fooock opened this issue 5 years ago • 1 comment

As Google says in its documentation:

A robots.txt request is generally cached for up to one day, but may be cached longer in situations where refreshing the cached version is not possible (for example, due to timeouts or 5xx errors). The cached response may be shared by different crawlers. Google may increase or decrease the cache lifetime based on max-age Cache-Control HTTP headers.

See this document
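To illustrate the request, here is a minimal sketch of how a crawler could derive a rule expiration time from the `Cache-Control` header instead of a fixed TTL. The helper name and the one-day fallback are assumptions for illustration, not this project's API; the fallback mirrors the default Google describes above.

```python
import re
import time

# Assumed fallback: Google documents that robots.txt is generally
# cached for up to one day when no other signal applies.
DEFAULT_TTL_SECONDS = 24 * 60 * 60

def robots_ttl(cache_control):
    """Return the cache lifetime in seconds for a robots.txt response,
    honoring a max-age directive if one is present."""
    if cache_control:
        match = re.search(r"max-age=(\d+)", cache_control)
        if match:
            return int(match.group(1))
    return DEFAULT_TTL_SECONDS

# Expiration timestamp for a freshly fetched robots.txt:
expires_at = time.time() + robots_ttl("public, max-age=3600")
```

A full implementation would also need to handle `no-store`/`no-cache` directives and the longer retention Google applies after timeouts or 5xx errors.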

fooock avatar Jun 30 '19 16:06 fooock