✨ Feature Request: Built-in LRU Cache Support in Bun
### What is the problem this feature would solve?
It would be valuable for the Bun runtime to include first-class support for an LRU (Least Recently Used) cache API that developers can use directly, without relying on third-party libraries.
While LRU cache implementations exist in the ecosystem, Bun currently doesn't provide a native, performant cache abstraction with eviction policies such as LRU. This leads to inconsistent behavior across environments and to reliance on external packages or hand-rolled workarounds.
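To illustrate the workaround burden, below is the kind of minimal userland LRU developers hand-roll today, relying on `Map`'s guaranteed insertion-order iteration. `SimpleLRU` is a hypothetical name for this sketch, not a Bun API:

```ts
// Minimal LRU cache built on Map's insertion-order iteration.
// Illustrative userland workaround, not a Bun API.
class SimpleLRU<K, V> {
  #map = new Map<K, V>();
  constructor(private max: number) {}

  get(key: K): V | undefined {
    if (!this.#map.has(key)) return undefined;
    const value = this.#map.get(key)!;
    // Re-insert so this key becomes the most recently used.
    this.#map.delete(key);
    this.#map.set(key, value);
    return value;
  }

  set(key: K, value: V): void {
    if (this.#map.has(key)) {
      this.#map.delete(key);
    } else if (this.#map.size >= this.max) {
      // Evict the least recently used entry (first in iteration order).
      this.#map.delete(this.#map.keys().next().value!);
    }
    this.#map.set(key, value);
  }

  get size(): number {
    return this.#map.size;
  }
}
```

This works, but every project reimplements it slightly differently, which is exactly the inconsistency a native API would remove.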
Motivation:
- Standardizing an efficient LRU cache implementation would improve developer experience and reduce the need for external dependencies.
- Some existing cache packages (e.g., `lru-cache`) have exhibited regressions or compatibility issues in Bun in the past, such as a regression in `lru-cache` TTL handling and performance differences relative to Node.js implementations. ([GitHub][1])
Proposal:
Add a native LRU cache API to Bun's standard library (e.g., `Bun.Cache.LRU` or similar), with features such as:
- Configurable maximum size
- Least-recently-used eviction policy
- Optional TTL (Time to Live) support
- Optional serialization / persistence integration (if useful)
This could be exposed under a standard namespace (instead of requiring `@std/cache` or other third-party modules), similar to how Deno and Node provide standardized utilities.
Example API (suggested):
```ts
// Create an LRU cache with max 500 entries
const cache = new Bun.LRUCache({ max: 500 });

// Set and get values
cache.set("user:123", { name: "Alice" });
const user = cache.get("user:123");

// Check size & stats
console.log(cache.size);      // number of current entries
console.log(cache.evictions); // number of items evicted so far
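Optional TTL could be layered on top of recency tracking by storing an expiry deadline per entry and treating expired entries as misses on read. A minimal sketch of that mechanism (the `TTLCache` name and lazy-expiry behavior are illustrative assumptions, not the proposed API surface):

```ts
// TTL-aware entry handling: entries past their deadline read as misses.
// Hypothetical sketch, not a Bun API.
type Entry<V> = { value: V; expiresAt: number };

class TTLCache<K, V> {
  #map = new Map<K, Entry<V>>();
  constructor(private ttlMs: number) {}

  set(key: K, value: V): void {
    this.#map.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }

  get(key: K): V | undefined {
    const entry = this.#map.get(key);
    if (!entry) return undefined;
    if (Date.now() >= entry.expiresAt) {
      this.#map.delete(key); // lazily expire on read
      return undefined;
    }
    return entry.value;
  }
}
```

A native implementation could combine this with the LRU eviction order, so an entry can disappear either because it expired or because it was least recently used.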
Use Cases:
- Caching database query results in edge functions
- HTTP server response cache layers
- Generic memoization utility without external deps
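As a concrete instance of the memoization use case, a bounded memoizer can be built over any LRU store. The `memoizeLRU` helper below is hypothetical, a sketch of what a native cache would make trivial, not an existing Bun or `lru-cache` API:

```ts
// Bounded memoizer: caches the `max` most recently used results of fn.
// Self-contained sketch; memoizeLRU is a hypothetical helper, not a Bun API.
function memoizeLRU<A, R>(fn: (arg: A) => R, max: number): (arg: A) => R {
  const cache = new Map<A, R>();
  return (arg: A): R => {
    if (cache.has(arg)) {
      const hit = cache.get(arg)!;
      cache.delete(arg); // refresh recency
      cache.set(arg, hit);
      return hit;
    }
    const result = fn(arg);
    if (cache.size >= max) {
      // Evict the least recently used result (first in iteration order).
      cache.delete(cache.keys().next().value!);
    }
    cache.set(arg, result);
    return result;
  };
}
```

With a native LRU cache, the body of this helper collapses to a couple of calls against the built-in API.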
Benefits:
- Improved performance and consistency
- Reduced reliance on third-party cache packages
- Makes Bun more complete as a runtime out of the box
Additional Context / References:
- The community has reported LRU cache behavior regressions and compatibility issues with Node modules in Bun in the past. ([GitHub][1])
- Performance benchmarking vs Node.js for cache eviction has shown significant differences for third-party implementations. ([GitHub][2])
Checklist (optional):
- [ ] API design proposal
- [ ] Benchmark/Performance goals
- [ ] Documentation draft
### What is the feature you are proposing to solve the problem?
```ts
// Create an LRU cache with max 500 entries
const cache = new Bun.LRUCache({ max: 500 });

// Set and get values
cache.set("user:123", { name: "Alice" });
const user = cache.get("user:123");

// Check size & stats
console.log(cache.size);      // number of current entries
console.log(cache.evictions); // number of items evicted so far
```
### What alternatives have you considered?
No response