IndexedDB Caching Made Simple
When Your App Needs to Remember Things
Picture this: Your user is on a flaky coffee shop WiFi, frantically trying to access data from your web app. The connection drops every thirty seconds, but somehow your app just keeps working smoothly. Magic? Nope - just smart caching with IndexedDB.
If you've ever wondered how apps like Gmail or Google Drive work seamlessly even when your internet connection resembles a broken garden hose, you're about to find out. IndexedDB is a powerful client-side storage API that lets you store significant amounts of structured data locally, enabling your web applications to work both online and offline.
Bottom Line: IndexedDB caching transforms flaky web experiences into reliable, fast applications that work regardless of network conditions.
Why IndexedDB Beats Other Storage Options
localStorage is like that tiny apartment you had in college - it seemed fine until you tried to fit your life into 5MB. IndexedDB offers much larger storage capacity (often hundreds of megabytes or more), while localStorage typically maxes out at 5-10MB per domain.
Here's what makes IndexedDB special for caching:
Massive Storage Capacity: IndexedDB is supported in roughly 97% of browsers in use, and unlike localStorage's strict limits, it can store huge amounts of data. Its storage limits are usually generous, often a percentage of available disk space, but different browsers handle quotas and data eviction differently.
Complex Data Types: While localStorage forces you to stringify everything, IndexedDB lets you store and retrieve objects that are indexed with a key; any objects supported by the structured clone algorithm can be stored. That means you can cache entire API responses, images, or complex objects without the JSON.stringify dance.
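To see the difference concretely, here's a quick sketch using the global structuredClone function, which implements the same algorithm IndexedDB uses internally when storing a value:

```javascript
// localStorage can only hold strings, so nested data must round-trip
// through JSON - losing types like Date, Map, and typed arrays
const record = {
  fetchedAt: new Date(),
  tags: new Map([['lang', 'en']]),
  payload: new Uint8Array([1, 2, 3])
};

// JSON round-trip (what localStorage forces on you)
const viaJson = JSON.parse(JSON.stringify(record));
console.log(viaJson.fetchedAt instanceof Date); // false - now a string
console.log(viaJson.tags instanceof Map);       // false - now {}

// Structured clone (what IndexedDB does when storing a value)
const viaClone = structuredClone(record);
console.log(viaClone.fetchedAt instanceof Date); // true
console.log(viaClone.tags.get('lang'));          // 'en'
```

Anything structuredClone can copy, IndexedDB can store as-is.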
Asynchronous Operations: IndexedDB operations are asynchronous, so they won't block your main thread - because nobody likes a frozen UI.
Setting Up Your Caching Layer
Let's build a simple but effective caching system. First, we'll create a basic IndexedDB wrapper:
class CacheDB {
  constructor(dbName = 'AppCache', version = 1) {
    this.dbName = dbName;
    this.version = version;
    this.db = null;
  }

  async init() {
    return new Promise((resolve, reject) => {
      const request = indexedDB.open(this.dbName, this.version);

      request.onerror = () => reject(request.error);

      request.onsuccess = () => {
        this.db = request.result;
        resolve(this.db);
      };

      // Runs on first open and whenever the version number increases
      request.onupgradeneeded = (event) => {
        const db = event.target.result;
        if (!db.objectStoreNames.contains('cache')) {
          const store = db.createObjectStore('cache', { keyPath: 'key' });
          store.createIndex('expiry', 'expiry');
        }
      };
    });
  }
}
Now let's add caching methods. Note that a raw IDBRequest can't simply be awaited, so each operation wraps its request in a promise:
async put(key, data, ttl = 3600000) { // 1 hour default
  const expiry = Date.now() + ttl;
  return new Promise((resolve, reject) => {
    const transaction = this.db.transaction(['cache'], 'readwrite');
    const store = transaction.objectStore('cache');
    const request = store.put({ key, data, expiry, timestamp: Date.now() });
    request.onsuccess = () => resolve();
    request.onerror = () => reject(request.error);
  });
}

async get(key) {
  return new Promise((resolve, reject) => {
    const transaction = this.db.transaction(['cache'], 'readonly');
    const store = transaction.objectStore('cache');
    const request = store.get(key);
    request.onerror = () => reject(request.error);
    request.onsuccess = () => {
      const result = request.result;
      if (!result || result.expiry < Date.now()) {
        resolve(null); // Expired or doesn't exist
      } else {
        resolve(result.data);
      }
    };
  });
}
The Stale-While-Revalidate Pattern
Here's where the magic happens. This pattern serves cached data immediately while fetching fresh data in the background:
async function cachedFetch(url, options = {}) {
  const cacheKey = `fetch_${url}`;
  const cache = new CacheDB();
  await cache.init();

  // Try to get cached data first
  const cachedData = await cache.get(cacheKey);

  // Start a background fetch regardless, so the cache stays fresh
  const fetchPromise = fetch(url, options)
    .then(response => {
      if (!response.ok) throw new Error(`HTTP ${response.status}`);
      return response.json();
    })
    .then(data => {
      cache.put(cacheKey, data, 300000); // Cache for 5 minutes
      return data;
    })
    .catch(error => {
      console.warn('Fetch failed:', error);
      return null;
    });

  // Return cached data immediately if available
  if (cachedData) {
    return { data: cachedData, fromCache: true };
  }

  // Otherwise wait for the network request
  const freshData = await fetchPromise;
  return { data: freshData, fromCache: false };
}
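The pattern itself is independent of IndexedDB. Stripped to its essentials, with the cache and fetcher injected so it can be unit-tested without a browser (the function and parameter names here are illustrative, not from any library):

```javascript
// Generic stale-while-revalidate: return stale data at once if present,
// and refresh the cache in the background either way
async function staleWhileRevalidate(key, cache, fetcher, ttl = 300000) {
  const cached = cache.get(key);
  const now = Date.now();

  const refresh = fetcher()
    .then(data => {
      cache.set(key, { data, expiry: now + ttl });
      return data;
    })
    .catch(() => null);

  if (cached && cached.expiry > now) {
    return { data: cached.data, fromCache: true };
  }
  return { data: await refresh, fromCache: false };
}
```

Here `cache` can be a plain Map and `fetcher` any function returning a promise, which makes the revalidation logic trivial to test.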
Squeezing More into Less Space with Compression
Here's where things get interesting. You know that feeling when you try to pack for a vacation and realize your suitcase is way too small? That's your IndexedDB without compression. Let's fix that.
Modern browsers are pretty smart - Chrome compresses large stored values using the Snappy real-time compression library, resulting in significant space savings. This is especially effective for structured data, such as large arrays of JavaScript values, XML, or JSON. But we can do better with client-side compression before the data even hits IndexedDB.
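If you'd rather not ship a compression library at all, recent browsers (and Node 18+) expose the same idea through the built-in CompressionStream API - gzip or deflate rather than Snappy, but the effect on repetitive JSON is similar. A minimal sketch:

```javascript
// Gzip a JS value to bytes using the built-in CompressionStream API
async function gzipJson(value) {
  const stream = new Blob([JSON.stringify(value)])
    .stream()
    .pipeThrough(new CompressionStream('gzip'));
  return new Uint8Array(await new Response(stream).arrayBuffer());
}

// Inverse: bytes back to the original value
async function gunzipJson(bytes) {
  const stream = new Blob([bytes])
    .stream()
    .pipeThrough(new DecompressionStream('gzip'));
  return JSON.parse(await new Response(stream).text());
}
```

The resulting Uint8Array can go straight into IndexedDB - no base64 detour needed, since binary values are structured-clone friendly.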
LZ-String: The Speed Demon
For most caching scenarios, LZ-String is your best friend. lz-string was designed to fulfill the need of storing large amounts of data in localStorage, specifically on mobile devices, but it works brilliantly with IndexedDB too.
import LZString from 'lz-string';

// Additional CacheDB methods
async putCompressed(key, data, ttl = 3600000) {
  // Compress the data before storing
  const json = JSON.stringify(data);
  const compressed = LZString.compress(json);
  const expiry = Date.now() + ttl;
  return new Promise((resolve, reject) => {
    const transaction = this.db.transaction(['cache'], 'readwrite');
    const store = transaction.objectStore('cache');
    const request = store.put({
      key,
      data: compressed,
      expiry,
      compressed: true,
      originalSize: json.length,
      compressedSize: compressed.length
    });
    request.onsuccess = () => resolve();
    request.onerror = () => reject(request.error);
  });
}

async getCompressed(key) {
  return new Promise((resolve, reject) => {
    const transaction = this.db.transaction(['cache'], 'readonly');
    const store = transaction.objectStore('cache');
    const request = store.get(key);
    request.onerror = () => reject(request.error);
    request.onsuccess = () => {
      const result = request.result;
      if (!result || result.expiry < Date.now()) {
        return resolve(null);
      }
      // Decompress if the record was stored compressed
      if (result.compressed) {
        resolve(JSON.parse(LZString.decompress(result.data)));
      } else {
        resolve(result.data);
      }
    };
  });
}
Real-world impact: I've seen API responses that were 800KB compress down to 200KB with LZ-String. That's 4x more data you can cache!
When to Compress vs When to Skip
Not everything benefits from compression. Here's your cheat sheet:
- Compress these: JSON API responses, text data, repetitive structures, large arrays
- Skip compression for: already compressed images, small objects (under 1KB), frequently accessed data that needs instant retrieval
function shouldCompress(data) {
  const jsonString = JSON.stringify(data);
  // Skip tiny data - compression overhead isn't worth it
  if (jsonString.length < 1000) return false;
  // Skip if data looks pre-compressed (high entropy = random-looking)
  if (calculateEntropy(jsonString) > 7.5) return false;
  return true;
}

// Shannon entropy in bits per character
function calculateEntropy(str) {
  const freq = new Map();
  for (const ch of str) freq.set(ch, (freq.get(ch) || 0) + 1);
  let entropy = 0;
  for (const count of freq.values()) {
    const p = count / str.length;
    entropy -= p * Math.log2(p);
  }
  return entropy;
}
Smart Cache Management
Nobody wants their app to become a digital hoarder. Here's how to keep your cache healthy:
async cleanExpired() {
  return new Promise((resolve, reject) => {
    const transaction = this.db.transaction(['cache'], 'readwrite');
    const store = transaction.objectStore('cache');
    const index = store.index('expiry');
    // Match every record whose expiry is in the past
    const range = IDBKeyRange.upperBound(Date.now());
    const request = index.openCursor(range);
    let deletedCount = 0;

    request.onerror = () => reject(request.error);
    request.onsuccess = (event) => {
      const cursor = event.target.result;
      if (cursor) {
        cursor.delete();
        deletedCount++;
        cursor.continue();
      } else {
        resolve(deletedCount); // No more expired entries
      }
    };
  });
}
Pro tip: Reads and writes to IndexedDB shouldn't be larger than required for the data being accessed. While IndexedDB makes it possible to store large, nested objects as a single record, this practice should be avoided. Break large objects into smaller chunks for better performance, and compress them for maximum space efficiency.
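A hypothetical sketch of that chunking idea - splitting one large array into fixed-size pieces stored under derived keys, plus a manifest record so readers know how many pieces to fetch (the key scheme here is made up for illustration):

```javascript
// Split a large array into chunk records small enough to read individually
function chunkRecords(key, items, chunkSize = 100) {
  const chunks = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    chunks.push({
      key: `${key}:chunk:${chunks.length}`,
      data: items.slice(i, i + chunkSize)
    });
  }
  // A small manifest record tells readers how many chunks exist
  chunks.push({ key: `${key}:manifest`, data: { count: chunks.length } });
  return chunks;
}
```

Each returned record can then be written with a normal `put`, and readers only load the chunks they actually need.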
Handling the Gotchas
IndexedDB has some quirks that can trip you up:
Safari's 7-Day Deletion: After 7 days of inactivity, Safari deletes all browser storage, including IndexedDB. This means you can't rely on IndexedDB for permanent storage - treat it as a smart cache, not a database.
Private Browsing: Some browsers don't allow writing to IndexedDB when in private browsing mode. Always wrap your operations in try-catch blocks and have fallbacks ready.
Transaction Auto-Commit: IndexedDB transactions commit automatically once they have no pending requests and control returns to the event loop. Even an innocuous Promise.resolve().then() between requests can let the transaction close prematurely, causing exceptions - especially in Safari and Firefox. Keep your operations synchronous within a transaction.
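One way to make the private-browsing fallback concrete is a facade that degrades to an in-memory Map whenever IndexedDB is missing or refuses writes. This is a sketch, and the SafeCache name is my own; it assumes a CacheDB-like object with async put/get:

```javascript
// Cache facade that falls back to an in-memory Map when IndexedDB
// is unavailable or throws (e.g. some private-browsing modes)
class SafeCache {
  constructor(db = null) {
    this.db = db;            // an initialized CacheDB-like object, or null
    this.memory = new Map(); // fallback store
  }

  async put(key, data, ttl = 3600000) {
    if (this.db) {
      try {
        return await this.db.put(key, data, ttl);
      } catch (err) {
        console.warn('IndexedDB write failed, using memory:', err);
      }
    }
    this.memory.set(key, { data, expiry: Date.now() + ttl });
  }

  async get(key) {
    if (this.db) {
      try {
        const hit = await this.db.get(key);
        if (hit !== null) return hit;
      } catch { /* fall through to memory */ }
    }
    const entry = this.memory.get(key);
    return entry && entry.expiry > Date.now() ? entry.data : null;
  }
}
```

The in-memory fallback loses data on reload, but that is exactly the cache-not-database mindset Safari's eviction policy forces anyway.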
Making It Production-Ready
For real applications, consider a library like Dexie.js, which offers a fluent API and supports advanced querying, or the simpler idb library, which wraps the native API in promises. RxDB, a NoSQL client-side database built on top of IndexedDB with support for indexes, compression, and replication, is another excellent choice for complex applications.
Compression library comparison:
- LZ-String: Fast compression, great for most use cases
- LZUTF8: Higher performance (3-14MB/s compression, 20-120MB/s decompression), optimized for UTF-8
- Built-in browser compression: Chrome compresses large stored values with the Snappy real-time compression library, providing significant space savings with zero code
Performance tip: Batch operations by grouping multiple read/write operations in a single transaction, use compression for larger datasets, and consider async compression with web workers for huge files to avoid blocking the main thread.
The Payoff
IndexedDB truly shines when it comes to creating rich, offline-first experiences. With the increasing focus on performance, reliability, and seamless user experiences, knowledge of IndexedDB is a significant asset for any web developer.
When implemented correctly, IndexedDB caching makes your app feel instant. Users get immediate responses from cached data while fresh content loads silently in the background. It's like having the best of both worlds - the speed of local data and the freshness of live updates.
Next step: Start small - cache your most frequently accessed API responses with compression enabled, and gradually expand your caching strategy. Your coffee shop WiFi users will definitely notice the difference, and so will your storage quotas!