loke.dev

A Fragile Speed for the JavaScript Proxy

An investigation into why the V8 engine struggles to inline Proxy traps and the specific architectural patterns that trigger 'megamorphic' performance cliffs.


I spent weeks convinced that my new reactivity engine was the fastest thing since slice() until I actually plugged it into a real-world dashboard. On paper, the logic was flawless—clean, decoupled, and using the latest JavaScript features. But in practice, the frames were dropping like flies. I couldn't understand why a simple property access like state.count was suddenly the bottleneck in my flame graph. I assumed it was my logic, but it turned out to be the very tool I thought was making my life easier: the Proxy object. Once I dug into how the V8 engine actually handles these wrappers, the "magic" of metaprogramming started to look a lot more like a performance tax I hadn't budgeted for.

The Allure and the Hidden Cost

We love Proxies because they feel like cheating. They allow us to intercept fundamental operations—getting, setting, defining properties—without changing the underlying object's structure. It's the backbone of modern frameworks like Vue 3 and SolidJS.

But here’s the rub: in the world of high-performance JavaScript, "transparent" is a lie. To the developer, a Proxy looks like the object it wraps. To the V8 engine (the heart of Chrome and Node.js), a Proxy is a complex, opaque wall that shatters some of its most important optimizations.

When you access a property on a standard object, V8 is incredibly good at guessing where that property lives in memory. When you introduce a Proxy, you’re telling the engine: "Stop guessing. Run this arbitrary JavaScript function instead." That shift from "memory offset lookup" to "function execution" is the first step toward the performance cliff.

How V8 Usually Wins: The Fast Path

To understand why Proxies are slow, you have to understand why regular objects are fast. V8 uses a concept called Hidden Classes (or Shapes).

If you have ten thousand objects that all have the properties { x, y }, V8 doesn't store the keys x and y on every single instance. Instead, it creates a Hidden Class that says: "Property x is at offset 0, and property y is at offset 1."

When you write a function like this:

function getX(point) {
  return point.x;
}

The first time this runs, V8 records the Hidden Class of point. The next time, it checks if the Hidden Class is the same. If it is, it bypasses the expensive lookup and jumps straight to the memory offset. This is called Inline Caching (IC). If the function is called enough times with the same shape, V8 inlines the access entirely, turning point.x into a simple machine-level instruction.
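The shape sharing can be sketched in plain JavaScript. The engine's internal hidden classes aren't directly observable from script, so the comments describe what V8 does behind the scenes:

```javascript
function getX(point) {
  return point.x;
}

// Same property order => both objects share one hidden class,
// so the call site inside getX stays monomorphic.
const a = { x: 1, y: 2 };
const b = { x: 3, y: 4 };

// Different insertion order => a *different* hidden class,
// even though the final property set is identical.
const c = { y: 6, x: 5 };

console.log(getX(a), getX(b), getX(c)); // 1 3 5
```

After the call with c, the call site inside getX has seen two shapes instead of one, which is exactly the kind of drift that erodes the fast path.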

The Proxy Problem: An Inlining Dead End

Now, let's throw a Proxy into that getX function.

const handler = {
  get(target, prop) {
    return target[prop];
  }
};

const proxyPoint = new Proxy({ x: 10, y: 20 }, handler);

// Even this "transparent" access is a problem
getX(proxyPoint);

When V8 encounters the Proxy inside getX, the Inline Cache breaks. Why? Because a Proxy doesn't have a stable "shape" in the way a normal object does. The engine can't simply calculate a memory offset for .x because the get trap could, theoretically, return Math.random(), or call a remote server, or delete the object itself.
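To make the engine's dilemma concrete: nothing stops a get trap from changing the type of its result on every single read. A contrived sketch:

```javascript
let flip = false;
const chaotic = new Proxy({ x: 1 }, {
  get(target, prop) {
    flip = !flip;
    // Alternate between a number and a string on successive reads,
    // so no single machine-level representation of .x can be assumed.
    return flip ? target[prop] : String(target[prop]);
  }
});

console.log(typeof chaotic.x); // 'number'
console.log(typeof chaotic.x); // 'string'
```

No real codebase does this on purpose, but the compiler has to assume any trap *could*.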

Because the get trap is a dynamic function, V8's optimizing compiler (TurboFan) often struggles to inline it. While V8 has made massive strides in "calling through" the Proxy (meaning it tries to optimize the path into the trap), it’s still significantly more work than a standard property access. You are trading a direct memory read for a context switch into a trap function.

The Megamorphic Cliff

The real trouble starts when we talk about Megamorphism.

In V8 terminology:
1. Monomorphic: A call site sees only one shape (super fast).
2. Polymorphic: A call site sees a few different shapes (fast).
3. Megamorphic: A call site sees *too many* different shapes (slow).

When a call site becomes megamorphic, V8 gives up on the specialized optimization and falls back to a generic, slow lookup. Proxies are "Megamorphism Magnets."

Imagine you have a single function processing various Proxies that wrap different types of targets:

function compute(obj) {
  return obj.value + 100;
}

const p1 = new Proxy({ value: 1 }, { get: (t, k) => t[k] });
const p2 = new Proxy({ value: 2, name: 'temp' }, { get: (t, k) => t[k] });
const p3 = new Proxy({ id: 1, value: 3 }, { get: (t, k) => t[k] });

// Calling compute() with different shapes wrapped in Proxies
compute(p1);
compute(p2);
compute(p3);

Because each Proxy might have a different target shape or a different handler, the engine’s Inline Cache for obj.value gets overwhelmed. It essentially says, "I can't keep track of all these variations," and drops into the runtime lookup. This is the "cliff." Once you fall off, your code can run 10x to 100x slower.

Practical Benchmarking: Quantifying the Tax

Let's look at a concrete example. I wrote a small script to compare basic property access across a raw object, a Proxy with a trap, and a Proxy with no traps (a "no-op" proxy).

const iterations = 100_000_000;

// 1. Raw Object
const raw = { a: 1 };
console.time('Raw Object');
let sum1 = 0;
for (let i = 0; i < iterations; i++) {
  sum1 += raw.a;
}
console.timeEnd('Raw Object');

// 2. No-op Proxy (No traps defined)
const noopProxy = new Proxy({ a: 1 }, {});
console.time('No-op Proxy');
let sum2 = 0;
for (let i = 0; i < iterations; i++) {
  sum2 += noopProxy.a;
}
console.timeEnd('No-op Proxy');

// 3. Proxy with Get Trap
const trappedProxy = new Proxy({ a: 1 }, {
  get(target, prop) {
    return target[prop];
  }
});
console.time('Trapped Proxy');
let sum3 = 0;
for (let i = 0; i < iterations; i++) {
  sum3 += trappedProxy.a;
}
console.timeEnd('Trapped Proxy');

// Use the sums so the optimizer can't dead-code-eliminate the loops
console.log(sum1, sum2, sum3);

Running this on Node 20 (V8 v11.3), the results are eye-opening:
- Raw Object: ~60ms
- No-op Proxy: ~95ms
- Trapped Proxy: ~650ms

The "trapped" version is 10 times slower than the raw object. Even the no-op Proxy, which theoretically does nothing, adds a 50% overhead because the engine still has to check if traps exist and handle the Proxy identity.

Why the Trap is Trapping You

It isn't just the fact that a function is being called. It’s the arguments that the get trap requires.

According to the ECMAScript spec, a get trap receives (target, property, receiver). To provide these, V8 has to:
1. Verify the receiver (the Proxy itself or an object inheriting from it).
2. Allocate or pass the property string.
3. Ensure the this context inside the trap is correct.

If your trap is just return target[prop], you are paying for all that setup just to do what the engine would have done anyway in a single CPU cycle.
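You can make that setup visible by recording what the trap actually receives (the variable names here are just for illustration):

```javascript
let seen;
const target = { a: 1 };
const proxy = new Proxy(target, {
  get(t, prop, receiver) {
    // The engine materializes all three arguments on every access:
    // the raw target, the property key, and the receiver.
    seen = { prop, sameTarget: t === target, sameReceiver: receiver === proxy };
    return t[prop];
  }
});

console.log(proxy.a); // 1
console.log(seen);    // { prop: 'a', sameTarget: true, sameReceiver: true }
```

Every one of those values has to be prepared before your trap body even starts running.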

Real-World Architectural Patterns That Trigger The Cliff

I've seen developers (including myself) fall into several specific patterns that make the Proxy overhead unbearable.

1. The Recursive Proxy (Deep Observability)

In reactivity systems, we often want to track nested objects. We do this by wrapping nested objects in Proxies as they are accessed.

function reactive(obj) {
  return new Proxy(obj, {
    get(target, key) {
      const value = Reflect.get(target, key);
      if (typeof value === 'object' && value !== null) {
        return reactive(value); // On-the-fly wrapping
      }
      return value;
    }
  });
}

This is a double-whammy. Not only do you have the get trap overhead, but you're also creating new Proxy instances inside a getter. If this happens inside a tight loop—say, rendering a list of 5,000 items—the garbage collector will start screaming, and the JIT will never find a stable path to optimize.
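One common fix is to cache wrappers in a WeakMap so each nested object is proxied at most once. This is a sketch of the idea, not any particular framework's implementation:

```javascript
const proxyCache = new WeakMap();

function reactive(obj) {
  // Reuse the existing wrapper instead of allocating a new Proxy per access.
  if (proxyCache.has(obj)) return proxyCache.get(obj);

  const proxy = new Proxy(obj, {
    get(target, key) {
      const value = Reflect.get(target, key);
      if (typeof value === 'object' && value !== null) {
        return reactive(value); // now returns the cached wrapper
      }
      return value;
    }
  });

  proxyCache.set(obj, proxy);
  return proxy;
}

const state = reactive({ nested: { count: 0 } });
console.log(state.nested === state.nested); // true: stable identity, no churn
```

Besides calming the garbage collector, the cache restores identity: two reads of the same nested path now return the same Proxy instead of two fresh ones.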

2. The Identity Crisis

Proxies break strictly-equal comparisons (===).

const original = { id: 1 };
const observed = new Proxy(original, {});

console.log(original === observed); // false

In large systems, this often leads to bugs where developers accidentally store both the raw object and the Proxy in a Map or Set. To fix it, they might add an "unwrap" check or a hidden property inside the get trap:

get(target, key) {
  if (key === '__raw__') return target;
  return target[key];
}

Now, every single property access in your entire application has to perform a string comparison (key === '__raw__') before doing anything else. It's a tiny check, but multiplied by millions of operations, it adds up.
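A cheaper alternative is to keep a WeakMap from each Proxy back to its raw target, so unwrapping never touches the get trap at all. A sketch, where `observe` and `toRaw` are hypothetical helper names:

```javascript
const rawMap = new WeakMap();

function observe(obj) {
  const proxy = new Proxy(obj, {
    get(target, key) {
      // No per-access string comparison here: the hot path stays clean.
      return target[key];
    }
  });
  rawMap.set(proxy, obj);
  return proxy;
}

// Hypothetical unwrap helper: falls back to the value itself for non-proxies.
function toRaw(value) {
  return rawMap.get(value) ?? value;
}

const original = { id: 1 };
const observed = observe(original);

console.log(toRaw(observed) === original); // true
console.log(toRaw(original) === original); // true
```

The lookup cost moves out of the trap and into the (rare) unwrap call, which is exactly where you want it.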

Mitigating the Damage

If you must use Proxies (and let's be honest, for many modern UI tasks, they are the best tool for the job), there are ways to keep the "fragile speed" from shattering.

Access the Target Directly in Loops

If you are doing heavy computation on an object that happens to be a Proxy, unwrap it *before* the loop starts.

// Slow
function processData(proxyArray) {
  for (let i = 0; i < proxyArray.length; i++) {
    doMath(proxyArray[i].value); // Trap called every iteration
  }
}

// Fast
function processData(proxyArray) {
  const raw = proxyArray.__raw__; // Hypothetical unwrap
  const len = raw.length;
  for (let i = 0; i < len; i++) {
    doMath(raw[i].value); // Back to raw speed
  }
}

Keep Handlers Static

Don't create a new handler object every time you create a Proxy. V8 can optimize better if the handler object itself has a stable shape.

// Don't do this
const p = new Proxy(target, { get(t, k) { ... } });

// Do this
const sharedHandler = {
  get(t, k) { ... }
};
const p = new Proxy(target, sharedHandler);

Use Reflect Wisely

Many people use Reflect.get(target, key, receiver) inside their traps. While Reflect is designed to work with Proxies, it has its own overhead. If you don't actually need the receiver for correct this binding (which you often don't in simple data objects), standard target[key] is marginally faster.
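The receiver only matters when the target has getters whose this should resolve through the Proxy. For plain data objects, target[key] skips that machinery entirely. An illustrative sketch:

```javascript
const target = {
  _name: 'raw',
  get name() {
    return this._name; // `this` depends on how the trap forwards the read
  }
};

const withReceiver = new Proxy(target, {
  get(t, key, receiver) {
    if (key === '_name') return 'proxied';
    return Reflect.get(t, key, receiver); // getter's `this` is the Proxy
  }
});

const withoutReceiver = new Proxy(target, {
  get(t, key) {
    if (key === '_name') return 'proxied';
    return t[key]; // getter's `this` is the raw target
  }
});

console.log(withReceiver.name);    // 'proxied' — the getter re-enters the trap
console.log(withoutReceiver.name); // 'raw' — the getter reads the raw target
```

If your targets are getter-free data objects, the two behave identically and the cheaper form wins.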

The V8 Perspective: Why Not Just Make It Faster?

I’ve had people ask, "Why doesn't V8 just optimize the hell out of Proxies?"

The answer is that they *are* trying. Every major V8 release usually includes some tweaks to Proxy performance. However, the very definition of a Proxy is to be a dynamic, non-deterministic boundary.

A compiler's job is to make assumptions. "I assume point.x will always be a double at offset 8." A Proxy's job is to prevent assumptions. "Actually, I might decide point.x is a string this time, or I might log a message to the console before returning it."

There's a fundamental tension between Metaprogramming Flexibility and JIT Optimization. You can have one, or you can have a bit of both, but you can rarely have the maximum of both.

The Verdict

Proxies are a superpower, but like any superpower in a resource-constrained environment, they have a cost. They are excellent for:
- State management systems (Vue, MobX).
- Validation libraries.
- Developer tooling/debugging wrappers.

They are dangerous for:
- Math-heavy code or physics engines.
- Large-scale data processing (use standard objects or TypedArrays).
- Performance-critical hot paths in your backend.

When I refactored my dashboard, I didn't get rid of Proxies entirely. I just changed *where* they lived. I stopped proxying every individual row in my data grid and instead proxied the top-level collection. I moved the heavy calculations to a "worker" function that operated on the raw, unwrapped data.

The speed of a Proxy is fragile because it depends on the engine being able to see through your traps. The moment you make your traps too complex or your shapes too varied, that visibility vanishes. Write your Proxies with the compiler's limitations in mind, and you might just keep your app from falling off the megamorphic cliff.