decoupled-json-content
Route invalidation?
Say we create an app that fetches the posts/pages. It can store them in state (Redux, MobX, setState, or whatever one chooses) or in localStorage, so that we don't need to fetch the same content over and over again.
But if the content changes on the WordPress side, how do we know that we need to update our store?
I found an article about how Modern Tribe did it on their site: https://tri.be/blog/redux-react-and-the-wordpress-rest-api-v2/
> Along with the route/entry pairs we store a last-visited timestamp every time your local cache is updated. The first time you land on our site we check for this timestamp and, if found, after the page has rendered we send the endpoint this timestamp and get back a list of IDs and routes for all posts updated since your last visit. Those are then scrubbed from your browser cache. First page-loads for a session always update cache, since we have the fresh data there anyways.
They are using localStorage to store their site data (I'm not overly fond of that, to be honest).
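For reference, the timestamp-and-scrub approach from the quote could be sketched like this. Note that the endpoint URL, the response shape, and the storage keys are all assumptions - the article doesn't spell them out:

```javascript
// Hypothetical sketch of the timestamp-based invalidation described above.
// scrubStale() is kept pure so the cache logic is easy to test; the endpoint
// URL, response shape, and storage keys are assumptions.
function scrubStale(cache, updatedRoutes) {
  // Drop every cached entry whose route changed since the last visit.
  const fresh = { ...cache };
  for (const route of updatedRoutes) {
    delete fresh[route];
  }
  return fresh;
}

async function invalidateOnLoad() {
  const lastVisited = localStorage.getItem('last-visited');
  localStorage.setItem('last-visited', String(Math.floor(Date.now() / 1000)));
  if (!lastVisited) return; // first visit, nothing cached yet

  // Assumed custom endpoint returning routes of posts updated since `lastVisited`.
  const res = await fetch(`/wp-json/site/v1/updated-since?since=${lastVisited}`);
  const updatedRoutes = await res.json(); // e.g. ['/about', '/blog/hello-world']
  const cache = JSON.parse(localStorage.getItem('route-cache') || '{}');
  localStorage.setItem('route-cache', JSON.stringify(scrubStale(cache, updatedRoutes)));
}
```

The downside, as noted below, is that all of this work happens up front on every page load, whether or not the user ever revisits the affected routes.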
How should we handle this invalidation?
It seems the app will have to make some kind of check call to WordPress to see whether something has changed or not.
@iruzevic @DarkoKukovec
I don't like this - it's doing too much of the work ahead of time. It means that when you return to the page, there is a chance it would redownload most of the content. I think we could combine two things: ETags, plus service workers for when the network is spotty or the user is offline.
We can also decide how long to keep an existing page (e.g. it doesn't really make sense to re-fetch a blog post's content every few seconds) - we could do this either by hardcoding something or by using `Cache-Control: max-age`.
Both of those are reliable, standardised caching methods that exist in all browsers (~75% of users have service worker support right now, and that number should grow to ~90% within a few months; the service worker part is a bonus anyway).
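To make the ETag idea concrete, here's a minimal sketch of manual revalidation with `If-None-Match`. Browsers already do this transparently for the regular HTTP cache, so something like this is only needed when we manage the cache ourselves (e.g. in a service worker or an in-memory store). The `store` shape and the injectable `fetchFn` are assumptions made for testability:

```javascript
// Sketch of manual ETag revalidation, assuming the server sends an ETag
// header on each response. `store` maps URL -> { etag, body }; `fetchFn`
// is injectable so the logic can be exercised without a network.
async function fetchWithEtag(url, store, fetchFn = fetch) {
  const cached = store.get(url);
  const headers = cached ? { 'If-None-Match': cached.etag } : {};
  const res = await fetchFn(url, { headers });
  if (res.status === 304 && cached) {
    return cached.body; // unchanged on the server: reuse the local copy
  }
  const body = await res.json();
  store.set(url, { etag: res.headers.get('ETag'), body });
  return body;
}
```

A 304 response has no body, so the transfer cost of an unchanged page is just the headers - much cheaper than redownloading content ahead of time.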
I was looking at my test response and found no ETags, which means this should be set up on the server (I only had a `Cache-Control: max-age=0` header)?
Basically, the caching and the checks should be handled by the front-end application?
Yes, ETags should be handled manually - basically something like `sha1(url + updatedAt)`. `max-age` should be larger than 0 if we don't want to invalidate on every call.