flutter_map_tile_caching
[FEATURE] Throttle browsing network tile requests
Discussed in https://github.com/JaffaKetchup/flutter_map_tile_caching/discussions/163
Originally posted by eidolonFIRE July 29, 2024 My app displays many locations at once (many FlutterMap widgets, each at a different LatLng).
Loading a page suddenly spams a tile provider with requests for very different locations, which triggers the server to deny all requests. Sometimes I get blocked by tile providers for several hours.
Q: Is there some way to throttle tile fetching? Maybe limit how many concurrent requests a tile store can make before they start to queue up?
Hi @eidolonFIRE,
This is not currently supported, but I'll look into implementing it in a future version (maybe v10). It should be possible to apply the throttling to network requests only, so requests served from the cache remain 'instant'. For the time being, you can reduce the number of tile requests by setting TileLayer.panBuffer to 0.
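For illustration, a minimal sketch of that workaround, assuming flutter_map's `panBuffer` parameter on `TileLayer` (the URL template is a placeholder):

```dart
import 'package:flutter_map/flutter_map.dart';

// Sketch only: disable the pan buffer so tiles outside the visible
// viewport are not pre-fetched, reducing the total request count.
TileLayer buildTileLayer() => TileLayer(
      urlTemplate: 'https://tile.example.com/{z}/{x}/{y}.png', // placeholder
      panBuffer: 0, // only request tiles for the visible viewport
      // ... your existing FMTC tile provider and other options
    );
```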
You may be able to use https://pub.dev/documentation/flutter_map/latest/flutter_map/TileUpdateTransformers/throttle.html.
However, this will space all requests evenly and not allow bursts. Implementing burst rate limiting is more difficult.
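A hedged sketch of wiring that transformer into a layer, assuming flutter_map's `tileUpdateTransformer` parameter and an example interval chosen only for illustration:

```dart
import 'package:flutter_map/flutter_map.dart';

// Sketch only: coalesce rapid map movements so tile loads are triggered
// at most once per interval. Note this evenly spaces update events; it
// does not rate-limit individual network requests or permit bursts.
TileLayer buildThrottledTileLayer() => TileLayer(
      urlTemplate: 'https://tile.example.com/{z}/{x}/{y}.png', // placeholder
      tileUpdateTransformer: TileUpdateTransformers.throttle(
        const Duration(milliseconds: 500), // example interval, tune as needed
      ),
    );
```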
Browse caching rate limiting will unfortunately not be coming in v10. I have investigated, but it seems like quite a large-scale project, and I could not find anything similar in Dart at the moment. The Guava project in Java seems to have exactly what is needed, but it is difficult to understand and would need to be ported, probably as an independent library. It turns out to be a lot more complicated than initially thought!
The algorithm we need is a bursty/jittery rate limiter - a leaky bucket "as a meter", or token bucket.
See:
- https://en.wikipedia.org/wiki/Leaky_bucket#As_a_meter
- https://en.wikipedia.org/wiki/Token_bucket
- https://github.com/google/guava/blob/master/guava/src/com/google/common/util/concurrent/SmoothRateLimiter.java
- https://github.com/google/guava/blob/master/guava/src/com/google/common/util/concurrent/RateLimiter.java
A special feature we need (I'm not sure whether Guava provides it, but other libraries definitely do not) is the ability to go into 'negative' tokens, essentially forming a bursty queue. If there are 500 requests in one second and our limit is 200/sec, a naive limiter would retry all 300 leftover requests in the next second, when in reality we want 200 in the next second and 100 the second after. One piece of complexity we don't need: we can consider all tasks to be the same size.
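To make the 'negative tokens' idea concrete, here is a rough, untested sketch of such a limiter in Dart. Everything here (the `TokenBucketLimiter` class, `acquire`, the parameters) is hypothetical and not FMTC or flutter_map API; it is just one way the behaviour described above could be modelled:

```dart
import 'dart:math';

/// Hypothetical token-bucket rate limiter (not FMTC API).
/// Tokens refill continuously at [ratePerSecond], capped at [burstCapacity].
/// The balance is allowed to go negative, so a burst larger than the bucket
/// forms a queue that drains at the steady rate instead of being retried
/// all at once in the next window.
class TokenBucketLimiter {
  TokenBucketLimiter({
    required this.ratePerSecond,
    required this.burstCapacity,
  }) : _tokens = burstCapacity;

  final double ratePerSecond;
  final double burstCapacity;

  double _tokens;
  DateTime _lastRefill = DateTime.now();

  /// Completes once this caller's token is available.
  /// All tasks are treated as the same size (one token each).
  Future<void> acquire() {
    final now = DateTime.now();
    final elapsedSeconds = now.difference(_lastRefill).inMicroseconds /
        Duration.microsecondsPerSecond;
    _lastRefill = now;

    // Refill based on elapsed time, capped at the burst capacity.
    _tokens = min(burstCapacity, _tokens + elapsedSeconds * ratePerSecond);

    // Spend one token. A negative balance means this caller (and later
    // ones) must wait until the bucket refills back past zero.
    _tokens -= 1;
    if (_tokens >= 0) return Future.value();

    final waitSeconds = -_tokens / ratePerSecond;
    return Future.delayed(Duration(
      microseconds: (waitSeconds * Duration.microsecondsPerSecond).ceil(),
    ));
  }
}
```

Each browse network fetch would then `await limiter.acquire()` before issuing the HTTP request, while tiles served from the cache would skip the limiter entirely. With `ratePerSecond: 200` and `burstCapacity: 200`, a burst of 500 requests lets 200 through immediately, roughly 200 over the next second, and the remaining 100 after that, rather than retrying all the excess in the next window.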
If anyone has any insight, it would be appreciated!