
I Thought WeakMaps Prevented Memory Leaks Until I Tried to Cache an API Response

Stop struggling with memory leaks in your custom caches by moving beyond WeakMaps and mastering the art of manual garbage collection tracking with WeakRef and FinalizationRegistry.


I spent three years telling junior developers that WeakMap was the secret to memory-efficient JavaScript. I'd point to the MDN documentation, wave my hands around talking about "garbage collection," and feel confident that I was providing the ultimate solution for caching. Then I tried to build a high-performance API response cache for a real-time dashboard, and the whole mental model fell apart.

I realized that WeakMap isn't actually a "cache" in the way most of us think of one. If you’re trying to prevent memory leaks in a system where you’re mapping strings (like URLs or IDs) to large objects, a WeakMap won't just fail to help you—it literally won't let you write the code.

The WeakMap Misconception

The common myth is that WeakMap is a magical tool that automatically deletes entries when they aren't being used. That's only half true, and the "half" that is true is very specific.

In a WeakMap, the keys must be objects, and those keys are held "weakly." If there are no other references to the key object anywhere else in your program, the entire entry (key and value) is eligible for garbage collection.

But here is the catch: in most API caching scenarios, your key is a string.

// This will throw a TypeError: Invalid value used as weak map key
const apiCache = new WeakMap();
const url = "https://api.example.com/data/123";

apiCache.set(url, { some: "massive_data_object" }); 

You can't use strings as keys in a WeakMap. This is a fundamental limitation. And even if you tried to wrap your URL in an object—like apiCache.set({ url }, data)—you'd still have a problem. Since you're creating a new object literal right there, you have no reference to that key elsewhere. The GC would wipe it out almost immediately, making your cache hit rate exactly 0%.

The Goal: A Cache that Cleans Itself

What we actually want when we talk about "memory-safe caching" is a system where:
1. We can use strings (URLs, IDs) as keys.
2. The values (the API responses) are held weakly.
3. If the rest of the application stops using a specific response object, the garbage collector (GC) should be able to reclaim that memory.
4. If the data is still in memory, we should be able to reuse it.

To do this, we have to look past WeakMap and reach for two newer primitives: WeakRef and FinalizationRegistry.

Enter WeakRef: The "Maybe" Reference

A WeakRef lets you hold a reference to an object without preventing it from being garbage collected.

Think of a standard variable as a "strong" leash. As long as you're holding the leash, the dog (the object) can't run away. A WeakRef is like having a photo of the dog. You can check the photo to see which dog it was, but the dog is free to leave whenever it wants.

let user = { name: "Alice", detailedBio: "..." };
const ref = new WeakRef(user);

// Later, we check if Alice is still around
const heldUser = ref.deref();
if (heldUser) {
    console.log("Cache hit!", heldUser.name);
} else {
    console.log("Cache miss: Alice was garbage collected.");
}

The .deref() method is the magic here. If the object hasn't been swept by the GC yet, you get the object back. If it has, you get undefined.

The "Dead Cell" Leak

If we simply use a standard Map to store our WeakRef objects, we've solved one problem but created another.

const cache = new Map();

function getCachedData(key) {
    const ref = cache.get(key);
    if (ref) {
        const data = ref.deref();
        if (data) return data;
    }
    return null;
}

function setCache(key, data) {
    cache.set(key, new WeakRef(data));
}

This looks good, right? The large data objects can be garbage collected. However, the WeakRef objects themselves—and the string keys in our Map—will never be removed.

If your app stays open for days and fetches 100,000 unique URLs, your Map will eventually contain 100,000 string keys and 100,000 empty WeakRef objects. Even though the big data is gone, these "dead cells" are still leaking memory. This is exactly what I encountered when my dashboard app started consuming 500MB of RAM just to store empty references.

FinalizationRegistry: The Janitor

To fix the dead cell leak, we need to know exactly *when* an object is garbage collected so we can remove the key from our Map. This is what FinalizationRegistry is for.

It allows you to register a callback that runs after an object you're watching has been reclaimed by the GC.

const registry = new FinalizationRegistry((heldKey) => {
  console.log(`Cleaning up map entry for: ${heldKey}`);
  cache.delete(heldKey);
});

function setCache(key, data) {
  cache.set(key, new WeakRef(data));
  // We tell the registry to watch 'data'. 
  // If 'data' is GC'd, it will pass the 'key' to our callback.
  registry.register(data, key);
}

Building the "Smart Cache"

Let's put this all together into a reusable class. This implementation handles the mapping of string keys to weakly-held values while ensuring the map itself stays lean.

class SmartCache {
  #cache = new Map();
  #registry;

  constructor() {
    // The callback receives the 'heldValue' we passed during registration
    this.#registry = new FinalizationRegistry((key) => {
      this.#cleanup(key);
    });
  }

  #cleanup(key) {
    const ref = this.#cache.get(key);
    // Double check: is the reference actually empty?
    if (ref && !ref.deref()) {
      console.debug(`Memory reclaimed: Removing key "${key}" from cache.`);
      this.#cache.delete(key);
    }
  }

  set(key, value) {
    if (typeof value !== 'object' || value === null) {
      throw new Error("SmartCache only works with objects as values.");
    }

    // 1. Store the weak reference
    this.#cache.set(key, new WeakRef(value));

    // 2. Register for cleanup
    // We pass the 'key' as the held value so the registry knows what to delete
    this.#registry.register(value, key);
  }

  get(key) {
    const ref = this.#cache.get(key);
    if (!ref) return undefined;

    const value = ref.deref();
    if (!value) {
      // Manual cleanup if we happen to hit an empty ref before the registry fires
      this.#cache.delete(key);
      return undefined;
    }

    return value;
  }
}

Why this is better than a standard LRU cache

In a standard Least Recently Used (LRU) cache, you set a limit (e.g., "keep 100 items"). But what if those 100 items are tiny? You're wasting memory. What if those 100 items are 10MB each? You're crashing the browser.

The SmartCache approach lets the environment decide. If the user's computer has 32GB of RAM and the browser is feeling generous, your cache can stay large. If the user opens 50 other tabs and memory pressure increases, the GC will aggressively reclaim your cached objects, and the FinalizationRegistry will trim your Map down automatically. It’s a cache that scales with the available hardware.

The "Gotchas" (The parts they don't tell you)

While this is powerful, it's not a silver bullet. There are several non-obvious traps I fell into.

1. The Value Must Be an Object

You cannot use WeakRef on primitives (strings, numbers, booleans). This makes sense: primitives are compared by value, not by identity, so there is no single heap object for the GC to track. If you need to cache a raw string, you'll have to wrap it in an object: { data: myString }.

2. Don't hold onto the reference locally

If you do this, you defeat the purpose:

const cache = new SmartCache();
const data = await fetcher();
cache.set('key', data);

// If 'data' is a global variable or stuck in a closure,
// it will NEVER be garbage collected, and SmartCache 
// acts just like a regular Map.
window.myData = data; 

The object is only eligible for collection when *nothing else* in your application is holding a reference to it.

3. Garbage Collection is non-deterministic

You cannot predict when the GC will run. In some browsers, it might run every few seconds. In others, it might wait until the tab is backgrounded. This means you can't use SmartCache for logic that requires immediate deletion. It's for performance optimization, not for state management.

4. The "Registry Lag"

The FinalizationRegistry callback doesn't run the millisecond the object is collected. It's queued. There might be a short window where ref.deref() returns undefined but the key still exists in the Map. Our get() method handles this by manually deleting the key if it discovers a dead reference.

Testing the Implementation

How do you actually test if this works? You can't just wait around for the GC. Fortunately, Chromium-based browsers (Chrome, Edge, Brave) provide a way to trigger garbage collection manually through the DevTools or via the --expose-gc flag in Node.js.

If you're using Chrome:
1. Open DevTools.
2. Go to the Memory tab.
3. Click the Trash Can icon (Collect garbage).

You can see the SmartCache in action with this snippet:

const myCache = new SmartCache();

(function() {
  // Create a large object in a local scope
  let bigData = { 
    payload: new Array(1_000_000).fill("🚀"),
    id: "heavy-request" 
  };
  
  myCache.set("api/heavy", bigData);
  console.log("Stored in cache:", myCache.get("api/heavy"));
})(); 
// bigData goes out of scope here

// If you click the Trash Can icon in DevTools now...
// The registry will fire and "api/heavy" will vanish from the map.
setTimeout(() => {
  console.log("After GC check:", myCache.get("api/heavy"));
}, 10000);

When to use this vs. WeakMap?

Use WeakMap when:
- You want to associate private data with an object you don't own. (e.g., storing metadata about a DOM element).
- The lifecycle of the value is strictly tied to the lifecycle of the key.

Use SmartCache (WeakRef + FinalizationRegistry) when:
- Your keys are strings (URLs, IDs, hash keys).
- You want to cache large objects to save on network/CPU, but you don't want to be the reason the app runs out of memory.
- You want the cache to be "elastic" based on the system's actual memory pressure.

Final Thoughts

Moving away from the "WeakMap is for caching" mindset was a turning point in how I write utility libraries. JavaScript’s memory management used to be a black box that we just hoped worked correctly. With WeakRef and FinalizationRegistry, we finally have the tools to collaborate with the garbage collector instead of fighting against it.

It takes more boilerplate than a simple new WeakMap(), but for a robust application, the visibility and control you gain over your memory footprint are worth every line of code. Stop guessing if your cache is leaking—build one that knows how to clean up after itself.