LRU Cache
A Least Recently Used (LRU) cache is a fixed-capacity cache that evicts the least recently used item when it reaches capacity. It is a staple of systems design and one of the most common caching policies in practice.
Why LRU?
Caches hold a small amount of frequently used data to avoid expensive operations (database queries, API calls, disk reads). LRU is based on temporal locality: if you accessed something recently, you are likely to access it again soon.
Implementation Strategy
The classic LRU cache is implemented with two data structures:
- HashMap — for O(1) lookups by key.
- Doubly Linked List — to track recency. Most recent at front, least recent at back.
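Before the Map shortcut shown next, it helps to see the two-structure version spelled out. The sketch below is one way to wire them together (names like DoublyLinkedLRU, _remove, and _addToFront are illustrative, not part of the original text); it uses sentinel head/tail nodes to avoid null checks at the ends of the list.

```javascript
// One possible hash map + doubly linked list LRU (illustrative names).
class Node {
  constructor(key, value) {
    this.key = key;
    this.value = value;
    this.prev = null;
    this.next = null;
  }
}

class DoublyLinkedLRU {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map(); // key -> Node, for O(1) lookup
    // Sentinel nodes: head.next is most recent, tail.prev is least recent.
    this.head = new Node(null, null);
    this.tail = new Node(null, null);
    this.head.next = this.tail;
    this.tail.prev = this.head;
  }

  // Unlink a node from the list in O(1).
  _remove(node) {
    node.prev.next = node.next;
    node.next.prev = node.prev;
  }

  // Splice a node in right after head (most recent position).
  _addToFront(node) {
    node.next = this.head.next;
    node.prev = this.head;
    this.head.next.prev = node;
    this.head.next = node;
  }

  get(key) {
    const node = this.map.get(key);
    if (!node) return -1;
    this._remove(node);       // moving to front marks it as
    this._addToFront(node);   // most recently used
    return node.value;
  }

  put(key, value) {
    if (this.map.has(key)) {
      this._remove(this.map.get(key));
    } else if (this.map.size >= this.capacity) {
      const lru = this.tail.prev; // least recently used node
      this._remove(lru);
      this.map.delete(lru.key);
    }
    const node = new Node(key, value);
    this.map.set(key, node);
    this._addToFront(node);
  }
}
```

The map answers "does this key exist, and where is its node?"; the list answers "which key is oldest?" — each in constant time.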
In JavaScript, a Map preserves insertion order and provides average O(1) access, so a single Map can play both roles:
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.cache = new Map();
  }

  get(key) {
    if (!this.cache.has(key)) return -1;
    const value = this.cache.get(key);
    // Move to most recent: delete and re-insert
    this.cache.delete(key);
    this.cache.set(key, value);
    return value;
  }

  put(key, value) {
    if (this.cache.has(key)) {
      this.cache.delete(key); // remove to update position
    } else if (this.cache.size >= this.capacity) {
      // Evict least recently used (first entry in Map)
      const lruKey = this.cache.keys().next().value;
      this.cache.delete(lruKey);
    }
    this.cache.set(key, value);
  }
}
Time complexity: O(1) on average for both get and put.
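The delete-and-re-insert trick works because a Map iterates its keys in insertion order, so the first key returned by keys() is always the least recently touched. A quick standalone demonstration:

```javascript
// Map preserves insertion order; delete + set moves a key to the end.
const m = new Map([["a", 1], ["b", 2], ["c", 3]]);
m.delete("a");
m.set("a", 1);
console.log([...m.keys()]);         // ["b", "c", "a"] — "a" is now most recent
console.log(m.keys().next().value); // "b" — the least recently used key
```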
Real-World Uses
- CPU L1/L2/L3 caches — hardware implements LRU (or approximations of it).
- Database buffer pools — MySQL's InnoDB manages its buffer pool with a modified LRU; PostgreSQL uses a clock-sweep approximation of LRU.
- Redis — can be configured with LRU or LFU eviction policies.
- CDN edge servers — cached content is evicted LRU when storage is full.
- Browsers — cached resources and background tabs are reclaimed using recency-based policies.
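As a concrete example of the Redis policy mentioned above, eviction is enabled in redis.conf with two directives (the memory limit here is illustrative):

```
# Cap memory and evict least-recently-used keys across the whole keyspace
maxmemory 100mb
maxmemory-policy allkeys-lru
```

Swapping allkeys-lru for allkeys-lfu switches to frequency-based eviction; note that Redis's LRU is itself a sampling-based approximation, not an exact LRU.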
Your Task
Implement an LRUCache class with get(key) (returns value or -1) and put(key, value) methods. The cache should have a fixed capacity and evict the least recently used item when full.