
Live generation on search engine query

augnustin opened this issue 11 years ago • 3 comments

There's a chance I missed something, but I don't understand why it isn't possible to have a rule that, instead of serving /public/sitemap.xml (which can be outdated), regenerates the sitemap and serves the fresh file on the fly, storing it in /public/sitemap.xml at the same time. That would save us the need for a whenever task and guarantee the sitemap is never older than xx time.

Is it that search engines look at the response time of the query? That would be a shame, since in many situations querying /sitemap.xml doesn't follow the same process as regular page generation (static content), so it shouldn't be an SEO criterion...
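For illustration, the regenerate-on-request idea above could be sketched roughly like this in plain Ruby. The helper name, the placeholder XML, and the one-hour default are all invented for the example; none of this is part of the dynamic_sitemaps API:

```ruby
require "tmpdir"

# Hedged sketch (not the gem's API): serve a cached sitemap file,
# regenerating it only when it is missing or older than max_age seconds.
def fresh_sitemap(path, max_age: 3600)
  stale = !File.exist?(path) || (Time.now - File.mtime(path)) > max_age
  if stale
    # A real app would invoke the sitemap generator here (e.g. a rake task);
    # we just write a minimal placeholder document.
    File.write(path, %(<?xml version="1.0" encoding="UTF-8"?><urlset/>))
  end
  File.read(path)
end
```

A Rails route could then point /sitemap.xml at an action calling a helper like this, so only the first visitor after expiry pays the regeneration cost and everyone else gets the cached file.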

augnustin avatar Feb 17 '14 11:02 augnustin

The previous version of Dynamic Sitemaps built the sitemap dynamically on each request. However, this was unsuitable for larger sitemaps (e.g. with 2 million pages), and it would tie up the web server when it could otherwise be serving "real" requests.

Therefore, sitemap generation was moved to background processing and optimized for many URLs.

That said, I know the dynamic sitemap feature is something a lot of people could use, especially for sites with fewer pages. So I'm considering a solution that would allow both dynamic and "static" sitemaps, serving the static sitemaps via an engine to require less initial setup.
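One way the dual mode described above might look, sketched as plain Ruby with invented names (`serve_sitemap`, `build_sitemap`) rather than anything from the gem:

```ruby
# Hypothetical sketch of a dual static/dynamic mode; serve_sitemap and
# build_sitemap are invented names, not part of dynamic_sitemaps.
def build_sitemap
  # Stand-in for the real generator.
  %(<?xml version="1.0" encoding="UTF-8"?><urlset><url><loc>https://example.com/</loc></url></urlset>)
end

def serve_sitemap(mode, static_path: "public/sitemap.xml")
  case mode
  when :static  # large sites: serve the pre-built file from background generation
    File.exist?(static_path) ? File.read(static_path) : nil
  when :dynamic # small sites: build the sitemap on each request
    build_sitemap
  end
end
```

An engine route could then pick the mode from configuration, keeping setup minimal for small sites while large sites keep the background job.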

I'm on it :blush: Thanks for your input.

lassebunk avatar Feb 18 '14 11:02 lassebunk

Thanks for the feedback. Indeed, a dual system would be super nice! I realized, though, that since I'm using Heroku, I'll have to store the file in a different location anyway. But that could still work, I guess.

augnustin avatar Feb 18 '14 13:02 augnustin

Yeah. I've also had a request to add support for S3 storage, to enable use on Heroku.

lassebunk avatar Feb 18 '14 13:02 lassebunk