Feature suggestion: Auto-refresh when the remaining TTL drops below a certain threshold
Use case: Imagine your hot call always takes around 1-2s and you want to cache it with a TTL of 60s. Since you never want your calling functions to wait 1-2s, the cache should auto-refresh when the remaining TTL drops below 10s.
Such a feature would make the lib complete for me. Please bear with me if that's already possible. I'm happy to hear any hints for implementing that on top of the lib, even if I might not have the resources to do so myself.
That's an interesting case. I've just released some changes to support this in 0.27.0. Unfortunately, it's not simple enough that I think it should be handled completely by the #[cached] macro, since refreshing the value would involve blocking the current thread, or knowing how not to block the current thread (new thread / async task).
Instead of adding that complexity to the macro itself, the macro now generates an additional function, *_prime_cache, to let you forcefully update the cache yourself. See https://github.com/jaemk/cached/blob/8295ad2761a17af41f50a50d4e964bb4ca5b4081/tests/cached.rs#L1094-L1128
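As a minimal sketch of priming a single known argument (the function fetch_len and its body are hypothetical; only the *_prime_cache naming and the same-signature behavior come from the macro):

use cached::proc_macro::cached;

// Cache results with a 60s TTL.
#[cached(time = 60)]
fn fetch_len(a: String) -> usize {
    // stand-in for the slow 1-2s computation from the use case above
    a.len()
}

fn refresh_one() {
    // Same signature as `fetch_len`, but always runs the body
    // and stores the fresh result in the cache.
    fetch_len_prime_cache("some-hot-key".to_string());
}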
So to get the functionality you're looking for, you can run your own bit of logic with your preferred concurrency mechanism (threads or async) to continuously update the cache, either for a specific set of arguments or for a #[once] singleton like this example from the readme: https://github.com/jaemk/cached/blob/8295ad2761a17af41f50a50d4e964bb4ca5b4081/src/lib.rs#L283-L294
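For the #[once] case, a rough sketch of that pattern (the Config type, get_config, and the assumption that #[once] also emits a *_prime_cache function are mine, not taken from the linked readme example):

use std::thread::sleep;
use std::time::Duration;

use cached::proc_macro::once;

#[derive(Clone)]
struct Config {
    token: String,
}

// Cache a single value program-wide; it expires 60s after being set.
#[once(time = 60)]
fn get_config() -> Config {
    // stand-in for an expensive fetch
    Config { token: "fetched-token".to_string() }
}

pub fn main() {
    std::thread::spawn(|| loop {
        // Re-prime shortly before the 60s expiry so callers never block.
        sleep(Duration::from_secs(50));
        // Assumption: #[once] also generates a *_prime_cache function.
        get_config_prime_cache();
    });
    // ... the rest of the program calls `get_config()` as usual
}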
The one "downside" of this is that you must prime the cache using every distinct set of function arguments. If you'd like to always re-prime every key, you can do so by iterating over the cache keys directly and passing those keys as function arguments like so
use std::thread::sleep;
use std::time::Duration;

use cached::proc_macro::cached;

#[cached(key = "String", convert = r#"{ String::from(a) }"#)]
fn keyed(a: &str) -> usize {
    a.len()
}

pub fn main() {
    // Background thread that re-primes every cached key once a minute.
    std::thread::spawn(|| {
        loop {
            sleep(Duration::from_secs(60));
            let keys: Vec<String> = {
                // Note: cache keys are a tuple of all function arguments,
                // unless the function takes a single value.
                KEYED.lock().unwrap().get_store().keys().cloned().collect()
            };
            for k in &keys {
                // Forcefully re-run `keyed` and store the fresh result.
                keyed_prime_cache(k);
            }
        }
    });
    // ... the rest of the program runs here; if main returns immediately,
    // the refresher thread is torn down with it.
}
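If the application is async rather than thread-based, the same loop can live in a background task instead; a sketch, assuming a tokio runtime is available and the keyed / KEYED / keyed_prime_cache items from the example above are in scope:

use std::time::Duration;

#[tokio::main]
async fn main() {
    tokio::spawn(async {
        loop {
            tokio::time::sleep(Duration::from_secs(60)).await;
            // Collect the current keys, then drop the lock before re-priming.
            let keys: Vec<String> =
                KEYED.lock().unwrap().get_store().keys().cloned().collect();
            for k in &keys {
                // `keyed` is cheap here; a genuinely slow body would be better
                // primed via tokio::task::spawn_blocking to avoid blocking the runtime.
                keyed_prime_cache(k);
            }
        }
    });

    // ... the rest of the async application runs here
}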