What's an LRU Cache?
An LRU (least recently used) cache is an abstract data structure that stores items and, once it reaches its maximum capacity, evicts the least recently used item, that is, the item that has not been accessed for the longest time. This structure provides fast access to frequently used data while discarding seldom-used data. What distinguishes it from other caching mechanisms is its least recently used (LRU) eviction policy. Other common eviction policies are most recently used (MRU) and least frequently used (LFU), each of which has its appropriate use cases depending on the access patterns of the data.
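To make the policy concrete, here's a minimal sketch (not the doubly-linked-list implementation we'll build in this article) that leans on the insertion-order guarantee of JavaScript's Map; the `MiniLRU` name is made up for illustration:

```typescript
// A minimal LRU sketch: re-inserting a key on every access keeps the
// least recently used key first in the Map's iteration order.
class MiniLRU<K, V> {
  private map = new Map<K, V>();

  constructor(private capacity: number) {}

  get(key: K): V | undefined {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key)!;
    this.map.delete(key); // re-insert so this key becomes most recent
    this.map.set(key, value);
    return value;
  }

  put(key: K, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // The first key in iteration order is the least recently used.
      const lru = this.map.keys().next().value!;
      this.map.delete(lru);
    }
  }
}

const mini = new MiniLRU<string, number>(2);
mini.put("a", 1);
mini.put("b", 2);
mini.get("a");    // "a" becomes most recent; "b" is now the LRU entry
mini.put("c", 3); // capacity exceeded: "b" is evicted
console.log(mini.get("b")); // undefined
```

This works, but every access pays the cost of a delete-and-reinsert, and it hides the mechanics we actually want to study; the rest of the article builds the classic explicit version.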
Why use an LRU Cache?
LRU caches are very useful in systems where memory usage is a concern but frequent data access is required. The LRU eviction policy makes these caches particularly well suited to scenarios in which we need to hold onto frequently used data without letting data that was once heavily used stick around long after its usage has declined. For example, CDNs use LRU caches to store frequently accessed web content on the network edge. Similarly, browsers use an LRU cache to store frequently accessed web content such as pages, images, and JS files on the client's machine.
Implementation
An LRU cache can be effectively implemented as the composition of a doubly-linked list and a hash map. The linked list keeps track of the order in which items were used, and the hash map provides fast look-ups by mapping keys to their respective nodes in the linked list.
For setup, let's create our linked list and a lookup map. We will use a POJO in place of a map for simplicity. We could create a separate LinkedList class, but I have decided to manage the linked list's state within our LRUCache implementation. Another design decision I've made is storing KVPairs within our nodes. Alternatively, we could use a reverse lookup map which maps nodes back to their keys.
For operations, we will support get and put. These are relatively simple, requiring some management of our map and linked list. Fortunately for us, we mostly just need to deal with the head of the linked list, meaning that we don't have to dive too far into pointer-management hell.
Operations:
For the get operation, we need to perform the following steps:
- Look up the node containing the item in the lookup map
- If it exists, then move this node to the front of the LRU linked list and return its contained value. Otherwise, do nothing.
For the put operation, we need to do the following:
- Look up the node in the lookup map
- If it exists, update the value that the node contains. Otherwise, create a new node and insert it into the lookup map
- Move the new (or existing) node to the front of the linked list
- Finally, enforce our cache's LRU eviction policy by removing the least recently used item if the size exceeds the defined capacity
class LRUCache<K, V = any> {
  private capacity: number;
  private size: number;
  private head: LLNode<KVPair<K, V>> | null;
  private tail: LLNode<KVPair<K, V>> | null;
  private map: Record<string, LLNode<KVPair<K, V>>>;

  constructor(capacity: number) {
    this.map = {};
    this.capacity = capacity;
    this.size = 0;
    this.head = this.tail = null;
  }
  get(key: K): V | undefined {
    const node = this.map[String(key)];
    if (node) {
      // Accessing an item makes it the most recently used.
      this.detach(node);
      this.prepend(node);
    }
    return node?.value?.value;
  }
  put(key: K, value: V): void {
    let node = this.map[String(key)];
    if (!node) {
      node = new LLNode({ key, value });
      this.map[String(key)] = node;
      this.size++;
    } else {
      node.value.value = value;
      this.detach(node);
    }
    this.prepend(node);
    this.enforceEvictionPolicy();
  }
  private enforceEvictionPolicy(): void {
    if (this.size > this.capacity) {
      // The tail is the least recently used item.
      const currTail = this.tail;
      this.detach(currTail!);
      delete this.map[String(currTail!.value.key)];
      this.size--;
    }
  }
  private detach(node: LLNode<any>): void {
    if (node.prev) {
      node.prev.next = node.next;
    }
    if (node.next) {
      node.next.prev = node.prev;
    }
    if (node === this.head) {
      this.head = node.next;
    }
    if (node === this.tail) {
      this.tail = node.prev;
    }
    node.prev = node.next = null;
  }
  private prepend(node: LLNode<any>): void {
    const currHead = this.head;
    if (currHead) {
      currHead.prev = node;
    }
    if (!this.tail) {
      this.tail = node;
    }
    node.next = currHead;
    this.head = node;
  }
}
class LLNode<T = any> {
  value: T;
  prev: LLNode<T> | null;
  next: LLNode<T> | null;

  constructor(value: T) {
    this.value = value;
    this.prev = null;
    this.next = null;
  }
}
type KVPair<K, V> = { key: K, value: V };
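To see the cache in action, here's a small usage example. The classes from the article are repeated in condensed form so the snippet runs standalone; the behavior is the same as above.

```typescript
// Condensed copy of the LRUCache, LLNode, and KVPair definitions from
// this article, so the demo below is self-contained.
type KVPair<K, V> = { key: K; value: V };

class LLNode<T = any> {
  prev: LLNode<T> | null = null;
  next: LLNode<T> | null = null;
  constructor(public value: T) {}
}

class LRUCache<K, V = any> {
  private size = 0;
  private head: LLNode<KVPair<K, V>> | null = null;
  private tail: LLNode<KVPair<K, V>> | null = null;
  private map: Record<string, LLNode<KVPair<K, V>>> = {};

  constructor(private capacity: number) {}

  get(key: K): V | undefined {
    const node = this.map[String(key)];
    if (node) {
      this.detach(node);
      this.prepend(node);
    }
    return node?.value.value;
  }

  put(key: K, value: V): void {
    let node = this.map[String(key)];
    if (!node) {
      node = new LLNode({ key, value });
      this.map[String(key)] = node;
      this.size++;
    } else {
      node.value.value = value;
      this.detach(node);
    }
    this.prepend(node);
    if (this.size > this.capacity) {
      // Evict the tail: the least recently used entry.
      const lru = this.tail!;
      this.detach(lru);
      delete this.map[String(lru.value.key)];
      this.size--;
    }
  }

  private detach(node: LLNode<KVPair<K, V>>): void {
    if (node.prev) node.prev.next = node.next;
    if (node.next) node.next.prev = node.prev;
    if (node === this.head) this.head = node.next;
    if (node === this.tail) this.tail = node.prev;
    node.prev = node.next = null;
  }

  private prepend(node: LLNode<KVPair<K, V>>): void {
    if (this.head) this.head.prev = node;
    if (!this.tail) this.tail = node;
    node.next = this.head;
    this.head = node;
  }
}

// A capacity-2 cache: touching "a" makes "b" the LRU entry,
// so inserting "c" evicts "b".
const cache = new LRUCache<string, number>(2);
cache.put("a", 1);
cache.put("b", 2);
cache.get("a");
cache.put("c", 3);
console.log(cache.get("a")); // 1
console.log(cache.get("b")); // undefined (evicted)
console.log(cache.get("c")); // 3
```

Note how the get on "a" changes which key gets evicted: without it, "a" would have been the least recently used entry when "c" arrived.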