
The External Memory Trap: Why Your Node.js FFI Logic Is Silently Starving the Garbage Collector
A deep dive into why V8 remains oblivious to native heap allocations and how to use the AdjustAmountOfExternalAllocatedMemory API to prevent silent OOM crashes.
I’ve spent many late nights staring at a Grafana dashboard where the Node.js memory usage—at least according to the V8 heap metrics—is a flat, healthy line, yet the process keeps hitting OOM (Out of Memory) kills. It’s a ghost in the machine. You check process.memoryUsage().heapUsed and it tells you everything is fine, maybe 150MB. But the resident set size (RSS) is climbing into the gigabytes until the OS kernel loses patience and sends a SIGKILL.
This discrepancy isn't a bug in Node.js, nor is it a leak in the traditional sense. It’s a fundamental disconnect between how V8 manages its internal sandbox and how your native addons (or FFI calls) interact with the system's raw memory. We often reach for Rust, C++, or Zig to speed up Node.js, but we forget that V8 is an accountant who only tracks the cash in the register, completely oblivious to the massive debt you’re racking up in the back room.
The Illusion of the Managed Heap
When you write standard JavaScript, V8 is your nanny. You create an object, you stop using it, and eventually, the Garbage Collector (GC) sweeps it away. The GC decides when to run based on "heap pressure." If the heap is getting full, it triggers a scavenge (minor GC) or a mark-sweep (major GC).
The problem starts when you use the Foreign Function Interface (FFI) or write a C++ Addon. When you allocate memory in C++ using malloc or new, that memory lives on the system heap, not the V8 heap.
V8 has no idea that native memory exists.
Imagine you have a Node.js wrapper for an image processing library. Each time you call processImage(), the native side allocates 50MB of raw buffer space to hold pixels. On the JavaScript side, you might only hold a small handle—a "wrapper" object—that is barely a few dozen bytes.
// This looks cheap to V8, but it's expensive to the OS
const image = nativeLib.load('ultra_hd_photo.raw');

To V8, you just allocated a tiny object. It thinks, *"I have 2GB of heap and I've only used 150MB. No need to run the GC yet."* Meanwhile, your native code has allocated 1.5GB of RAM. Because V8 doesn't feel any "pressure," it doesn't trigger a GC cycle. Because it doesn't trigger a GC cycle, the JavaScript wrapper objects aren't collected. Because the wrappers aren't collected, their native destructors (which free the 50MB buffers) are never called.
You aren't leaking memory; you're just not being loud enough about how much you're using.
Seeing the Blind Spot in Action
Let's look at a practical example of this "silent starvation." Suppose we have a simple C++ addon (using Node-API/napi) that allocates a large chunk of memory.
// native_addon.cpp
#include <node_api.h>
#include <stdlib.h>
napi_value AllocateBig(napi_env env, napi_callback_info info) {
  // Allocate 50MB of native memory
  void* data = malloc(50 * 1024 * 1024);

  // We create a "dummy" object to return to JS.
  // In a real app, this might have methods to manipulate the 'data'.
  napi_value wrapper;
  napi_create_object(env, &wrapper);

  // We attach the raw pointer and a finalizer so we can free it later.
  // The finalizer only runs when 'wrapper' is GC'ed.
  napi_add_finalizer(env, wrapper, data, [](napi_env env, void* finalize_data, void* finalize_hint) {
    free(finalize_data);
  }, NULL, NULL);

  return wrapper;
}

Now, let's run a loop in JavaScript that calls this repeatedly:
const addon = require('./build/Release/addon');
function run() {
  for (let i = 0; i < 1000; i++) {
    const handle = addon.AllocateBig();
    // We 'lose' the reference immediately, so it's eligible for GC
  }
  console.log('Finished loop. Checking memory...');
  console.log(process.memoryUsage());
}
run();

If you run this, you'll likely see the RSS (Resident Set Size) skyrocket, while heapUsed stays remarkably low. If you're on a machine with limited RAM, the process will crash before the for loop even finishes. V8 sees 1000 tiny objects. It doesn't see the ~50GB (1000 × 50MB) of native memory associated with them.
The Solution: AdjustAmountOfExternalAllocatedMemory
To fix this, we have to talk to the accountant. We need to tell V8, "Hey, I just spent some money that isn't in your ledger."
In the V8 C++ API, this is done via Isolate::AdjustAmountOfExternalAllocatedMemory. In the more modern and stable Node-API (N-API), the function is napi_adjust_external_memory.
This function takes an int64_t value representing the change in memory. When you allocate, pass a positive number; when you free, pass a negative number. This tells V8's GC heuristics that the process is under more pressure than the heap alone suggests.
Let’s rewrite our native addon to be a better citizen:
// native_addon_fixed.cpp
#include <node_api.h>
#include <stdlib.h>
const int64_t BLOB_SIZE = 50 * 1024 * 1024;
napi_value AllocateBig(napi_env env, napi_callback_info info) {
  void* data = malloc(BLOB_SIZE);

  // Tell V8 we just grabbed 50MB of system RAM
  int64_t adjusted_mem;
  napi_adjust_external_memory(env, BLOB_SIZE, &adjusted_mem);

  napi_value wrapper;
  napi_create_object(env, &wrapper);

  napi_add_finalizer(env, wrapper, data, [](napi_env env, void* finalize_data, void* finalize_hint) {
    free(finalize_data);
    // Tell V8 the memory is back in the pool
    int64_t updated;
    napi_adjust_external_memory(env, -BLOB_SIZE, &updated);
  }, NULL, NULL);

  return wrapper;
}

With this change, V8 now knows that for every tiny object it creates, the system loses 50MB. As soon as the accumulated "external memory" hits a certain threshold relative to the available system RAM and heap size, V8 will trigger a GC cycle. That cycle identifies that the wrapper objects are no longer reachable and runs the finalizers, which in turn call free() and adjust the external memory count back down.
Why FFI is Particularly Dangerous
If you are using ffi-napi or koffi instead of writing your own C++ addon, you are in even more danger. These libraries make it incredibly easy to call native functions, but they often abstract away the memory management entirely.
If you call a native function that returns a pointer to a struct, and that struct was allocated by the C library, you are responsible for freeing it. If you wrap that pointer in a Buffer or a custom object, you must ensure that you’re either:
1. Using a FinalizationRegistry in JavaScript to clean up.
2. Explicitly calling a free function.
But even if you do clean up, you still face the "starvation" problem. JavaScript's FinalizationRegistry is also triggered by the GC. If the GC doesn't run, the registry doesn't fire.
A Pure JavaScript Workaround?
If you are stuck in JS land (using FFI) and can't call napi_adjust_external_memory directly, you have to find other ways to signal pressure.
One "hacky" but effective method is to use Buffer.allocUnsafe. Buffers in Node.js are one of the few objects that *already* implement external memory reporting. When you create a large Buffer, Node.js internally calls the V8 adjustment APIs.
I’ve seen developers "pressure" the GC by allocating a large, temporary Buffer when they know they’ve just done a lot of FFI work:
function pressureGC() {
  // Allocate 100MB and let it go immediately.
  // This forces V8 to acknowledge that memory is moving.
  Buffer.allocUnsafe(100 * 1024 * 1024);
}

This is ugly. It's a kludge. But in a high-throughput FFI environment where you don't have access to the C++ layer, it can be the difference between a stable service and a 3 AM page.
The Nuance of "External Memory"
It’s important to understand that AdjustAmountOfExternalAllocatedMemory doesn't actually *allocate* anything. It’s purely advisory. You’re just whispering in V8’s ear.
However, if you whisper too much, you can hurt performance. If you tell V8 you’ve allocated 10GB when you’ve only allocated 10MB, the GC will become hyperactive. It will spend all its time scanning the heap for objects to free, trying to alleviate pressure that doesn't actually exist. This leads to "GC thrashing," where your CPU spikes and your event loop latency goes through the roof.
Rule of thumb: Only report memory that is tied to the lifecycle of a JavaScript object. If you have a global native cache that lives forever, V8 doesn't need to know about it as much, because GCing won't help reclaim it anyway. You should manage that via your own monitoring.
Tracking the Invisible
How do you know if you're falling into this trap? You need to look at the gap.
In your monitoring (Prometheus, Datadog, etc.), track three metrics:
1. nodejs_heap_size_used_bytes
2. process_resident_set_size_bytes
3. nodejs_external_memory_bytes (Available via process.memoryUsage().external)
If external is low, but the gap between RSS and HeapUsed is massive, you have "hidden" native memory. This is your warning sign.
In modern Node.js versions, process.memoryUsage() provides a nice breakdown:
console.log(process.memoryUsage());
/*
{
  rss: 125399040,      // Total memory used by the process
  heapTotal: 15237120, // Total V8 heap size
  heapUsed: 8345096,   // Actual JS objects
  external: 50331648,  // <--- This is what V8 thinks is outside the heap
  arrayBuffers: 10522  // Buffers and TypedArrays
}
*/

If you are writing a native addon and you aren't seeing your allocations reflected in that external number, you are setting an OOM trap for your future self.
The Finalizer Gotcha
There is a subtle race condition in native addons that often catches people off guard. When V8 decides to GC an object, it doesn't necessarily run the finalizer *immediately* on the main thread in a way that correlates with the JS execution.
In Node-API, finalizers can run while the isolate is being torn down or in various GC phases. You must ensure that your cleanup logic is thread-safe or that it doesn't rely on the JavaScript environment still being "alive" and accessible.
Also, avoid doing heavy work in a finalizer. If your finalizer calls a native library to close a database connection, perform a network request, or write to disk, you are stalling the GC process. Keep it simple: free() the memory and get out.
Summary Checklist for FFI/Addon Developers
1. Don't trust the heap: If you're using FFI, your "real" memory usage is invisible to Node's default GC triggers.
2. Report your debts: In C++, use napi_adjust_external_memory (or the equivalent in your framework) every time you malloc or free something tied to a JS object.
3. Use `Buffer` when possible: If you're passing data back and forth, Buffer objects are better at self-reporting their size to V8 than raw pointers.
4. Monitor the Gap: Watch the difference between RSS and Heap. If the gap grows while the heap stays flat, you're missing an adjustment call somewhere.
5. Test under pressure: Run your native logic in a loop with a small heap limit (node --max-old-space-size=128). If it crashes before the GC kicks in, you've found a trap.
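For item 5, here is a minimal harness sketch. The `addon` object is a stand-in that allocates via Buffer, which self-reports its size, so this version survives; swap in your real native module and compare:

```javascript
// stress.js -- run with: node --max-old-space-size=128 stress.js
// Stand-in for a native module; point this at your real addon.
const addon = {
  allocateBig: () => Buffer.allocUnsafe(50 * 1024 * 1024),
};

for (let i = 0; i < 1000; i++) {
  addon.allocateBig(); // dropped immediately, eligible for GC
  if (i % 200 === 0) {
    const { rss, heapUsed, external } = process.memoryUsage();
    console.log(
      `iter ${i}: rss=${(rss / 1048576) | 0}MB ` +
      `heapUsed=${(heapUsed / 1048576) | 0}MB ` +
      `external=${(external / 1048576) | 0}MB`
    );
  }
}
console.log('Survived: external reporting kept the GC fed.');
```

If the same loop around your real addon crashes while this one survives, the difference is almost certainly a missing napi_adjust_external_memory call.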
Writing native code for Node.js is a powerful way to break through the limitations of JavaScript, but it requires giving up the luxury of total ignorance. You have to become the bridge between the managed world and the unmanaged world—and that means being a very diligent bookkeeper.


