Furkan Baytekin

Building an LRU Cache in JavaScript: Least Recently Used Cache

Build a fast LRU cache in JavaScript and optimize your app's memory usage


Let’s chat about something that takes your caching game to the next level: the LRU cache. If you’ve ever memoized a function (like we did with those pricey calculations last time) and worried about memory bloating up with every new input, this one’s for you. LRU (Least Recently Used) caching is all about keeping the good stuff handy while kicking out the old junk when space runs tight. We’ll dig into what it is, why it’s clutch for performance, and build one from scratch in JavaScript with a real-world example. Grab your coffee, and let’s get coding!

What’s an LRU Cache?

Imagine your desk has room for five books, but you keep grabbing new ones. Eventually, you’ve got to ditch something to make space. An LRU cache works the same way: it holds a fixed number of items and, when it’s full, tosses the least recently used one to fit the newbie.

In JavaScript land, this is perfect for scenarios where you’re caching results (say, API calls or heavy computations) but don’t want an ever-growing Map eating your RAM. It’s a performance hack with a memory cap—best of both worlds. Let’s see how to whip one up.
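Before we do, it helps to see the problem in isolation. Here’s a minimal sketch of an unbounded memoizer (expensiveCalc is a made-up stand-in for any pricey function):

```javascript
// A plain memoizer: fast, but its cache grows forever.
function memoize(fn) {
  const cache = new Map();
  return (input) => {
    if (!cache.has(input)) cache.set(input, fn(input));
    return cache.get(input);
  };
}

const expensiveCalc = (n) => n ** 2; // pretend this is slow
const memoized = memoize(expensiveCalc);

for (let i = 0; i < 1_000_000; i++) memoized(i);
// The cache now holds a million entries and will never shrink.
// An LRU cache keeps the speed win but puts a ceiling on that Map.
```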

The Plan: A Simple LRU Cache Class

We’ll build an LRUCache class with:

- a fixed capacity set at construction time,
- a get(key) method that returns a cached value and marks it as most recently used,
- a put(key, value) method that adds or updates an entry and evicts the least recently used one when the cache is full.

For our real-world spin, imagine a mini translation app caching API results—say, English to Turkish phrases. Users type phrases, and we cache translations to skip redundant API calls, but we cap the cache size to avoid hogging memory.

Step 1: The Basic Structure

Let’s use a Map for O(1) lookups and recency tracking (since Map preserves insertion order in modern JS):

```javascript
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity; // Max items allowed
    this.cache = new Map();   // Key-value store with order
  }

  // Get an item, move it to "most recent"
  get(key) {
    if (!this.cache.has(key)) return undefined;
    const value = this.cache.get(key);
    this.cache.delete(key);     // Remove it
    this.cache.set(key, value); // Re-add to mark as recent
    return value;
  }

  // Add or update an item, evict if full
  put(key, value) {
    if (this.cache.has(key)) {
      this.cache.delete(key); // Remove to refresh recency
    } else if (this.cache.size >= this.capacity) {
      // Evict the least recently used (first item in Map)
      const oldestKey = this.cache.keys().next().value;
      this.cache.delete(oldestKey);
    }
    this.cache.set(key, value); // Add as most recent
  }
}

// Quick test
const cache = new LRUCache(3);
cache.put('hello', 'merhaba');
cache.put('goodbye', 'güle güle');
cache.put('thanks', 'teşekkürler');
console.log([...cache.cache]);
// [['hello', 'merhaba'], ['goodbye', 'güle güle'], ['thanks', 'teşekkürler']]

cache.put('yes', 'evet'); // Evicts 'hello'
console.log([...cache.cache]);
// [['goodbye', 'güle güle'], ['thanks', 'teşekkürler'], ['yes', 'evet']]

console.log(cache.get('goodbye')); // 'güle güle', moves it to end
console.log([...cache.cache]);
// [['thanks', 'teşekkürler'], ['yes', 'evet'], ['goodbye', 'güle güle']]
```

How It Works

get checks for the key, then deletes and re-inserts the entry so it lands at the end of the Map’s insertion order, which is our “most recently used” position. put does the same refresh for existing keys; for a new key on a full cache, it evicts the first key in the Map, which is always the least recently used one. This works great—O(1) for both operations—and fits our translation cache idea. Let’s make it real.
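The whole trick leans on one guarantee: a JavaScript Map iterates its entries in insertion order. Here’s the delete-and-re-insert move in isolation:

```javascript
// A Map iterates in insertion order, so the first key is the oldest.
const m = new Map([['a', 1], ['b', 2], ['c', 3]]);
console.log(m.keys().next().value); // 'a' (oldest)

// Deleting and re-inserting a key moves it to the end of that order,
// which is exactly how get() marks an entry as "most recent".
m.delete('a');
m.set('a', 1);
console.log([...m.keys()]); // ['b', 'c', 'a']
```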

Real-World Example: Caching Translations

Picture a little UI where users type English phrases, and we fetch Turkish translations from an API (mocked here for simplicity). Without caching, every repeat phrase hits the API. With an LRU cache, we save hits but cap memory. Here’s the setup:

```javascript
// Mock API (slow, expensive)
function fetchTranslation(phrase) {
  console.log(`Fetching translation for "${phrase}"...`);
  const translations = {
    hello: 'merhaba',
    goodbye: 'güle güle',
    thanks: 'teşekkürler',
    yes: 'evet',
    no: 'hayır'
  };
  return new Promise(resolve =>
    // 1s delay to simulate a slow API call
    setTimeout(() => resolve(translations[phrase] || '???'), 1000)
  );
}

// Translation function with caching, using the LRUCache class we built earlier
const translationCache = new LRUCache(3);

async function getTranslation(phrase) {
  const cached = translationCache.get(phrase);
  // Strict undefined check, so falsy values would still count as hits
  if (cached !== undefined) {
    console.log(`Cache hit for "${phrase}"`);
    return cached;
  }
  const translation = await fetchTranslation(phrase);
  translationCache.put(phrase, translation);
  return translation;
}

// Test it out
(async () => {
  console.time('First hello');
  console.log(await getTranslation('hello'));
  /** Fetches 'merhaba', ~1s
   * (We cannot say exactly 1s because of
   * the event loop and other factors)
   */
  console.timeEnd('First hello');

  console.time('Second hello');
  console.log(await getTranslation('hello')); // Cache hit, instant
  console.timeEnd('Second hello');

  await getTranslation('goodbye'); // Fetches 'güle güle'
  await getTranslation('thanks');  // Fetches 'teşekkürler'
  await getTranslation('yes');     // Fetches 'evet', evicts 'hello'

  console.log(await getTranslation('hello')); // Fetches again, ~1s
  console.log([...translationCache.cache]);
  // [['thanks', 'teşekkürler'], ['yes', 'evet'], ['hello', 'merhaba']]
})();
```

Breaking It Down

The first getTranslation('hello') misses, waits ~1s for the mock API, and caches the result; the second is a cache hit and returns almost instantly. Once 'goodbye', 'thanks', and 'yes' fill the three slots, 'hello' is evicted as the least recently used, so asking for it again costs a fresh fetch. The UI stays snappy for repeats, and memory stays capped. Perfect for a small app where users toggle between a handful of phrases.
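One gap worth knowing about: if two calls for the same uncached phrase fire at the same time, both miss the cache and both hit the API. A common variation is to cache the in-flight promise instead of the resolved value. Here’s a sketch built on the same LRUCache and fetchTranslation from above (getTranslationShared is an illustrative name, not part of the original class):

```javascript
// Sketch: cache the promise itself so concurrent calls for the
// same phrase share a single API request.
const promiseCache = new LRUCache(3);

function getTranslationShared(phrase) {
  const cached = promiseCache.get(phrase);
  if (cached !== undefined) return cached;

  const pending = fetchTranslation(phrase);
  promiseCache.put(phrase, pending); // stored before it resolves
  return pending;
}

// Both calls start at once, but only one "Fetching..." log appears
Promise.all([
  getTranslationShared('no'),
  getTranslationShared('no')
]).then(([a, b]) => console.log(a, b)); // 'hayır' 'hayır'
```

In production you’d also want to evict a promise that rejects, so a transient network error isn’t cached forever.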

Pros, Cons, and Edge Cases

Pros:

- O(1) get and put, thanks to Map’s constant-time lookups and insertion-order iteration.
- Bounded memory: the cache never holds more than capacity entries.
- Tiny and dependency-free: the whole class is about 25 lines.

Cons:

- No expiry: entries only leave when evicted by newer ones, so it’s a poor fit for data that goes stale on a clock (you’d want a TTL for that).
- Recency is refreshed by delete-plus-re-insert, which churns the Map on every hit. Fine in practice, but not free.
- Picking the right capacity is on you: too small and you thrash with evictions, too big and you’re back to hogging memory.

Edge Cases:

- A capacity of 0 still ends up holding one entry, because put inserts the new item after evicting; validate capacity in the constructor if that matters.
- Storing undefined as a value is indistinguishable from a miss, since get returns undefined for both.
- Falsy values ('', 0, false) cache fine, but check hits with !== undefined (as our translation example does), not a plain truthiness test.
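If you want to guard against those, the fixes are small. Here’s a sketch (SafeLRUCache and its guards are illustrative, not part of the class above):

```javascript
// Sketch: validate capacity up front and expose a has() helper to
// resolve the "stored undefined vs. not present" ambiguity.
class SafeLRUCache extends LRUCache {
  constructor(capacity) {
    if (!Number.isInteger(capacity) || capacity <= 0) {
      throw new RangeError('capacity must be a positive integer');
    }
    super(capacity);
  }

  // True if the key is cached, even if its value is undefined.
  // Note: unlike get(), this does not refresh recency.
  has(key) {
    return this.cache.has(key);
  }
}

const safe = new SafeLRUCache(2);
safe.put('key', undefined);
console.log(safe.get('key')); // undefined (ambiguous on its own)
console.log(safe.has('key')); // true, so it really is cached
// new SafeLRUCache(0); // throws RangeError
```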

Bonus: A Doubly Linked List Version?

Map keeps it simple, but a classic LRU uses a doubly linked list + hash map for explicit recency. It’s overkill for most JS use cases (Map’s fast enough), but here’s a peek:

```javascript
class Node {
  constructor(key, value) {
    this.key = key;
    this.value = value;
    this.prev = null;
    this.next = null;
  }
}

class LRUCacheDL {
  constructor(capacity) {
    this.capacity = capacity;
    this.cache = new Map();
    this.head = null; // Most recent
    this.tail = null; // Least recent
  }

  get(key) {
    const node = this.cache.get(key);
    if (!node) return undefined;
    this._moveToFront(node);
    return node.value;
  }

  put(key, value) {
    let node = this.cache.get(key);
    if (node) {
      node.value = value;
      this._moveToFront(node);
    } else {
      node = new Node(key, value);
      this.cache.set(key, node);
      this._addToFront(node);
      if (this.cache.size > this.capacity) {
        this._removeTail();
      }
    }
  }

  _addToFront(node) {
    node.prev = null; // Clear any stale link when re-inserting a moved node
    if (!this.head) {
      this.head = this.tail = node;
    } else {
      node.next = this.head;
      this.head.prev = node;
      this.head = node;
    }
  }

  _moveToFront(node) {
    if (node === this.head) return;
    node.prev.next = node.next;
    if (node === this.tail) {
      this.tail = node.prev;
    } else {
      node.next.prev = node.prev;
    }
    this._addToFront(node);
  }

  _removeTail() {
    if (!this.tail) return;
    this.cache.delete(this.tail.key);
    this.tail = this.tail.prev;
    if (this.tail) this.tail.next = null;
    else this.head = null;
  }
}
```

This is more “textbook” LRU—explicit list for order—but Map’s usually fine unless you’re micro-optimizing.
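From the outside it behaves just like the Map version. Here’s the earlier quick test rerun against it:

```javascript
// Same drill as before: three slots, evict the least recently used.
const dl = new LRUCacheDL(3);
dl.put('hello', 'merhaba');
dl.put('goodbye', 'güle güle');
dl.put('thanks', 'teşekkürler');

dl.put('yes', 'evet');          // Evicts 'hello', the oldest entry
console.log(dl.get('hello'));   // undefined
console.log(dl.get('goodbye')); // 'güle güle', now most recent

dl.put('no', 'hayır');          // Evicts 'thanks', not 'goodbye'
console.log(dl.get('thanks'));  // undefined
console.log(dl.get('goodbye')); // 'güle güle', still cached
```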

Wrapping Up

An LRU cache in JavaScript is your secret weapon when you need speed and sanity. Whether it’s translations, search suggestions, or game scores, it keeps your app lean and mean. Our Map-based version is simple, fast, and production-ready; tweak the capacity to fit your needs, and you’re golden.

