Feature request: LRU cache
All in the title. Considering the great assortment of utility containers and algorithms in ETL, it would be nice to have a cache of fixed size that automatically deletes old entries.
Or, better yet, extend the framework to caches with different eviction policies.
I actually started looking at cache containers several years ago, but never got further than an interface (etl::icache) and a simple LRU cache example at the time. Maybe it's time to look at it again. 👍
Thanks for the prompt response!
Upon looking at icache.h I've realized that we are talking about different caches. What I mean is a generic data container: essentially a map of fixed size that, when a new value is inserted over capacity, evicts the oldest entry (oldest read, first inserted, whatever the policy). The API of such a container is basically a set of get/put/contains methods. icache.h, on the other hand, seems to be a caching IO system that caches data from some backend via write_store/read_store.
Examples of a generic memory cache can be seen in
- Android's LruCache / Java's LinkedHashMap
- Python's functools.lru_cache or cachetools
Most implementations combine a map with a structure that tracks insertion/access order (e.g. a doubly linked list), as in the sketch below.
Once there is a general-purpose in-memory cache, it can be (almost trivially) used to build an IO cache that wraps a backend.
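For illustration, a minimal sketch of what I have in mind (the lru_cache name and the get/put/contains signatures are just an example, and std::list/std::unordered_map stand in for whatever fixed-capacity, non-allocating storage an ETL version would actually use):

```cpp
#include <cstddef>
#include <list>
#include <unordered_map>
#include <utility>

// Hypothetical fixed-capacity LRU cache: a map for O(1) lookup plus a
// doubly linked list that keeps usage order, most recently used at the front.
template <typename TKey, typename TValue>
class lru_cache
{
public:
  explicit lru_cache(std::size_t capacity) : capacity_(capacity) {}

  bool contains(const TKey& key) const
  {
    return index_.find(key) != index_.end();
  }

  // Returns a pointer to the cached value (or nullptr if absent) and marks
  // the entry as most recently used.
  TValue* get(const TKey& key)
  {
    auto it = index_.find(key);
    if (it == index_.end()) return nullptr;
    order_.splice(order_.begin(), order_, it->second); // move node to front
    return &it->second->second;
  }

  void put(const TKey& key, const TValue& value)
  {
    auto it = index_.find(key);
    if (it != index_.end())
    {
      it->second->second = value;                        // update in place
      order_.splice(order_.begin(), order_, it->second); // move node to front
      return;
    }

    if (index_.size() == capacity_)
    {
      // Evict the least recently used entry (the back of the list).
      index_.erase(order_.back().first);
      order_.pop_back();
    }

    order_.emplace_front(key, value);
    index_[key] = order_.begin();
  }

private:
  using entry_t = std::pair<TKey, TValue>;

  std::size_t capacity_;
  std::list<entry_t> order_;                                              // usage order
  std::unordered_map<TKey, typename std::list<entry_t>::iterator> index_; // key -> node
};
```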
Good point