
Why Does Your Node.js Memory Usage Never Seem to Go Back Down?
Restarting your container is just a band-aid for the hidden closures and global caches slowly strangling your server’s heap.
Most developers treat the --max-old-space-size flag like a volume knob: when the server starts lagging or crashing with "Out of Memory" errors, they just crank it up. But throwing more RAM at a Node.js process is often like buying a bigger trash can because you're too lazy to take out the bags. The V8 engine's garbage collector (GC) is actually a masterpiece of engineering; if it's not reclaiming your memory, it's not because it's broken. It's because you've accidentally told it that you're still using every single byte of that data.
The "Flat Line" Illusion
You’ve probably seen the graph in your monitoring tool. The memory climbs steadily, hits a plateau, and stays there. You stop sending requests to the staging server, wait ten minutes, and... nothing. The line doesn't drop.
Here is the first thing you need to accept: V8 doesn't like giving memory back to the Operating System.
Allocating memory from the OS is expensive. If Node.js managed to grab 2GB of RAM and then finished a heavy task, it won't immediately release that 2GB. It keeps it in its "Resident Set Size" (RSS) because it figures, "Hey, I might need this again in five seconds, and it’s faster if I already have it."
If you want to see if you actually have a leak, you shouldn't look at the RSS. You need to look at the Heap Used. If the Heap Used keeps climbing while the app is idle, you’ve got a problem.
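You can watch the two numbers diverge yourself with the built-in process.memoryUsage() call. A minimal sketch of a periodic logger (the 30-second interval is an arbitrary choice):

```javascript
// Log heap usage next to RSS. heapUsed is the number to watch:
// if it keeps climbing while the app is idle, you have a leak.
const toMB = (bytes) => (bytes / 1024 / 1024).toFixed(1);

const logMemory = () => {
  const { rss, heapTotal, heapUsed } = process.memoryUsage();
  console.log(`rss=${toMB(rss)}MB heapTotal=${toMB(heapTotal)}MB heapUsed=${toMB(heapUsed)}MB`);
};

// unref() so this timer alone doesn't keep the process alive.
setInterval(logMemory, 30_000).unref();
```

A steadily rising rss with a flat heapUsed is V8 hoarding freed pages; a rising heapUsed is your code hoarding references.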
The "Temporary" Cache That Isn't
We’ve all done this. We want to avoid hitting the database for the same user profile every three seconds, so we create a quick little cache.
```javascript
const userCache = {};

app.get('/user/:id', async (req, res) => {
  const { id } = req.params;
  if (!userCache[id]) {
    userCache[id] = await db.getUser(id);
  }
  res.json(userCache[id]);
});
```

This looks innocent. But in a Node.js process, userCache is a global variable. It lives as long as the process lives. If you have 100,000 users, that object grows until it's holding 100,000 user objects in memory. It never expires. It never clears. You've just built a memory leak by hand.
The Fix: Use a proper LRU (Least Recently Used) cache with a size limit, or better yet, use Redis. If it’s in-process memory, it *must* have a ceiling.
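If you want to stay in-process, a size-capped cache is only a few lines. Here is a minimal LRU sketch that exploits a Map's insertion order (in production you'd reach for the lru-cache package on npm, which also handles TTLs):

```javascript
// Minimal LRU cache: a Map remembers insertion order, so its first
// key is always the least recently used. maxSize is a hard ceiling.
class LRUCache {
  constructor(maxSize = 500) {
    this.maxSize = maxSize;
    this.map = new Map();
  }

  get(key) {
    if (!this.map.has(key)) return undefined;
    // Re-insert to mark the key as most recently used.
    const value = this.map.get(key);
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  set(key, value) {
    this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxSize) {
      // Evict the oldest entry. Crucially, the Map drops its
      // reference, so the GC can actually reclaim the value.
      this.map.delete(this.map.keys().next().value);
    }
  }
}

const userCache = new LRUCache(1000); // never holds more than 1000 users
```

The point isn't the eviction policy; it's that the cache physically cannot grow without bound.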
Closures: The Silent Hoarders
Closures are one of JavaScript's best features, but they are also the most common way to accidentally hold onto massive objects.
```javascript
function heavyTask() {
  const massiveData = Buffer.alloc(1024 * 1024 * 50); // 50MB
  return function () {
    console.log("I'm just a tiny function!");
    // I don't even use massiveData!
  };
}

const leakyFunc = heavyTask();
```

In the example above, leakyFunc keeps a reference to the lexical environment of heavyTask. Modern V8 can usually discard massiveData here because the returned function never touches it, but the optimization is fragile: closures created in the same scope share a single context object, so if any sibling closure references massiveData, the whole 50MB stays pinned for all of them.
The real danger comes when these closures are attached to things that live a long time, like event listeners or timers.
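Timers are the classic case: a setInterval callback pins its entire closure until someone clears it. A hypothetical sketch (connection and its stats() method are stand-ins for whatever long-lived resource you're polling):

```javascript
function startPolling(connection, intervalMs = 1000) {
  const history = []; // grows for as long as the timer lives

  const timer = setInterval(() => {
    // This closure pins both `connection` and `history`.
    history.push(connection.stats());
  }, intervalMs);

  // Without this line, the timer (and everything it closes over)
  // survives for the lifetime of the process.
  connection.once('close', () => clearInterval(timer));

  return history; // exposed only so callers can inspect it
}
```

The pattern to internalize: every setInterval should have a clearInterval whose trigger you can point to.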
Ghost Event Listeners
This is the one that usually bites me. You’re using a stream or a socket, and you attach a listener.
```javascript
const setupSocket = (socket) => {
  socket.on('data', (data) => {
    // Process data
  });

  process.on('SIGUSR2', () => {
    console.log('Doing some cleanup...');
    // But we never removed the socket listener!
  });
};
```

Every time setupSocket is called, a new listener is added to the process object. The process object is global. It never dies. Consequently, every socket passed into that function is now trapped in memory forever because the process listener closure holds a reference to it.
The Rule: If you call .on(), you should almost always know exactly where you’re going to call .removeListener() or .off().
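One way to honor that rule is to tie every listener's lifetime to the socket's own 'close' event. A sketch reworking the example above (onData and onSignal are names I've introduced so the same function references can be passed to removeListener):

```javascript
const setupSocket = (socket) => {
  // Named handlers, so we can remove exactly what we added.
  const onData = (data) => {
    // Process data
  };
  const onSignal = () => {
    console.log('Doing some cleanup...');
  };

  socket.on('data', onData);
  process.on('SIGUSR2', onSignal);

  // When the socket goes away, detach everything that references it,
  // so the GC can reclaim the socket and its buffers.
  socket.once('close', () => {
    socket.removeListener('data', onData);
    process.removeListener('SIGUSR2', onSignal);
  });
};
```

Anonymous arrow functions passed straight into .on() can never be removed individually; naming the handler is what makes the cleanup possible.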
How do you actually find this stuff?
Stop guessing. If your memory is climbing, use the built-in node --inspect flag.
1. Start your app: node --inspect index.js
2. Open Chrome and go to chrome://inspect
3. Click "Open dedicated DevTools for Node"
4. Go to the Memory tab.
Take a Heap Snapshot. Do some work with your app. Take another snapshot. Click the "Comparison" view.
If you see (string) or (compiled code) growing by megabytes, look for the "Retainers" at the bottom. This is the "Why" section. It shows you the chain of variables keeping that object alive. If you see a chain that leads back to a Map or a Global object you didn't intend to fill, you’ve found your culprit.
Summary
Node.js memory management isn't magic, and it's not broken. Usually, "leaks" are just the engine doing exactly what you told it to do: keep a reference to data "just in case."
- Watch out for global objects/arrays used as caches.
- Be careful with closures attached to long-lived objects (like process or http.Server).
- Remember that RSS staying high is normal; Heap Used staying high is the red flag.
Don't just restart the container. Find the reference that won't let go.