
Cache JSON responses on the user side

Open MasterOdin opened this issue 11 years ago • 4 comments

The JSON files will only change so often (and I will set it up such that they're essentially regenerated on the hour). Therefore, we can cache the JSON responses locally until the next time we can actually get an update. This would drastically lower the load users put on the server.
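If the server really does regenerate the files on the hour, the client-side cache lifetime falls out of the clock. A minimal sketch (the helper name `msUntilNextHour` is hypothetical, not part of d2mt):

```javascript
// How long the cached JSON stays valid if the server refreshes on the hour:
// find the next top-of-hour boundary and return the milliseconds until then.
function msUntilNextHour(now = new Date()) {
    const next = new Date(now);
    next.setMinutes(0, 0, 0);             // rewind to the top of the current hour
    next.setHours(next.getHours() + 1);   // then advance to the next hour boundary
    return next - now;
}

// e.g. at 10:30:00 sharp the cache would be good for another 30 minutes
console.log(msUntilNextHour(new Date(2024, 0, 1, 10, 30, 0)));
```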

MasterOdin avatar Oct 05 '14 20:10 MasterOdin

Cache the stuff on the user side? The user requests dotaprj, not joindota. Dotaprj should slow down on the requests, not the other way around.

Either that, or we take down joindota... I mean, we're giving them free fucking traffic.

wololodev avatar Oct 23 '14 07:10 wololodev

This would be to decrease the load on dotaprj, since the response is the same within a 15-minute window. So, let's say I open d2mt to get a stream. I open it again 5 minutes later to get another stream, and again at 10 minutes. I've now made 15 requests to dotaprj (5 per open), 10 of which are duplicates, for a total of ~225 KB transferred.

By caching the responses from the first open, there would be only the 5 requests (~75 KB), which, compounded over a lot of users, could mean a drastic drop in total load on our end (especially during big tournaments with multiple streams and things).
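The arithmetic in the two comments above, spelled out (the per-request size of ~15 KB is inferred from the ~225 KB / 15 requests figure, not stated directly):

```javascript
// Figures from the comment: 5 API requests per open of the extension,
// 3 opens within one 15-minute cache window, ~15 KB per request.
const requestsPerOpen = 5;
const kbPerRequest = 15;
const opens = 3;

// Without caching, every open re-fetches everything.
const withoutCacheKb = opens * requestsPerOpen * kbPerRequest;

// With caching, only the first open in the window hits the server.
const withCacheKb = requestsPerOpen * kbPerRequest;

console.log(withoutCacheKb, withCacheKb); // 225 75
```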

MasterOdin avatar Oct 23 '14 12:10 MasterOdin

Ok, gotcha, makes sense, but the JSON aren't live requests to APIs; they're pre-saved on our server, so it isn't putting that much load on us ;). When you visit a page, you do about 50 requests for scripts, CSS, images, fonts, etc.; here you're just doing 4 tiny JSON requests.

This isn't worth changing the architecture for unless we take background processing to a whole new level, with things like notifications and such.

wololodev avatar Apr 28 '15 21:04 wololodev

When you visit a page, most modern browsers try to serve cached results if you've just been browsing the page, to save time/memory when loading.

But this isn't for the sake of the users; it's to save us on requests. There are three APIs currently called (DD2, GG, and Streams) at ~0.046 MB per call, and the responses don't change within the window. Assuming users open the extension multiple times in a 15-minute period, they're generating extra traffic on our servers that we wouldn't otherwise have to worry about.

The architecture change shouldn't be too major either as it'd basically just be something like this:

if (cacheExpired(localStorage.ggCacheTime) || !localStorage.ggCache) {
    // fetch the current data exactly as we do now, then save it
    localStorage.ggCache = JSON.stringify({recent: recent, finished: finished});
}
else {
    var j = JSON.parse(localStorage.ggCache);
    $('#tbody_ggUpMatches').html(j.recent);
    $('#tbody_ggReMatches').html(j.finished);
}

The biggest thing required is some way of knowing when the cache expires (we could probably add code to get the current time, add 15 minutes to it, and save that timestamp alongside the JSON that gets passed to the user).
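That expiry check could look something like this; `cacheExpired` matches the name used in the snippet above, while `stampCache` and `CACHE_TTL_MS` are hypothetical names for illustration (using a plain object in place of `localStorage` so the sketch is self-contained):

```javascript
// 15-minute window, matching the refresh interval discussed above.
const CACHE_TTL_MS = 15 * 60 * 1000;

// The cache is stale if no timestamp was ever stored, or the stored
// expiry time (epoch milliseconds) has passed.
function cacheExpired(cacheTime) {
    return !cacheTime || Date.now() >= Number(cacheTime);
}

// Store the payload plus an expiry timestamp under '<key>Time'.
// localStorage only holds strings, so both values are stringified.
function stampCache(store, key, value) {
    store[key] = JSON.stringify(value);
    store[key + 'Time'] = String(Date.now() + CACHE_TTL_MS);
}
```

With `stampCache(localStorage, 'ggCache', {recent: recent, finished: finished})` on a fresh fetch, the `cacheExpired(localStorage.ggCacheTime)` check in the earlier snippet would work unchanged.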

MasterOdin avatar Apr 28 '15 22:04 MasterOdin