loke.dev

The Brotli-11 Trap

Sacrificing your Time to First Byte for a 1% gain in compression is a trade-off that modern high-performance apps can no longer afford.

4 min read

Run this command on a 1MB JavaScript bundle and watch your CPU fan start to scream: brotli -q 11 main.js. On a decent machine, you might wait two or three seconds. Switch that -q 11 to a -q 4 and it’s done before your finger leaves the Enter key.

In the pursuit of the "Green 100" Lighthouse score, many developers crank their compression settings to the max. We’ve been conditioned to think that smaller is always better. But Brotli level 11 is a siren song that lures you into a devastating performance trade-off: you're trading precious milliseconds of Time to First Byte (TTFB) for a byte-saving gain that is often statistically invisible to the end user.

The CPU Meat Grinder

Brotli offers quality levels from 0 to 11. Levels up through 9 are relatively fast, using a "greedy" matching algorithm. However, at levels 10 and 11 the engine switches to "optimal parsing": an exhaustive, computationally expensive search for the best possible back-references in the data stream.

I’ve seen production servers choke because they were trying to compress dynamic JSON responses at level 11 on every single request. Here is a rough look at what that looks like in terms of cost:

| Level | Compression Time | File Size (Example) |
| :--- | :--- | :--- |
| Gzip -6 | 15ms | 320 KB |
| Brotli -4 | 22ms | 290 KB |
| Brotli -9 | 140ms | 275 KB |
| Brotli -11 | 1800ms | 272 KB |

Notice that jump from 9 to 11? You’re spending nearly two seconds of server time to save 3KB. If your user is on a 50Mbps connection, downloading those 3KB takes about 0.5 milliseconds. You just made them wait 1,799.5 milliseconds longer so you could feel good about a smaller file. That’s not optimization; that’s a tax.

Dynamic vs. Static: The Great Divide

The mistake isn't using Brotli 11—it's using it at the wrong time.

If you are compressing a response on the fly (dynamic content), you should almost never go above level 4 or 6. If you're using Node.js with something like the compression middleware, the defaults are usually fine, but people often try to get "clever" with custom configurations.

The Wrong Way (Express/Node.js)

const express = require('express');
const zlib = require('zlib');
const app = express();

// DO NOT DO THIS for dynamic API responses
app.get('/api/data', (req, res) => {
  // Imagine a large, per-user JSON payload assembled here.
  const body = JSON.stringify({ user: req.query.id, items: [] });
  res.set('Content-Encoding', 'br');
  res.send(zlib.brotliCompressSync(body, {
    params: { [zlib.constants.BROTLI_PARAM_QUALITY]: 11 }, // destroys your TTFB and CPU
  }));
});

The Right Way (Nginx)

For dynamic traffic, Nginx users (running the ngx_brotli module) often find the sweet spot around level 4. It delivers significantly better compression than Gzip without the CPU latency spike.

http {
    brotli on;
    brotli_comp_level 4; # Fast enough for on-the-fly
    brotli_types text/plain text/css application/javascript application/json;
}

When Level 11 Actually Makes Sense

The only time you should touch the "11" dial is during your build process.

If you have static assets—your CSS bundles, your compiled React code, your SVG sprites—you should pre-compress them. Since this happens at build time, it doesn't matter if it takes five minutes to compress. The server then just serves the .br file directly from the disk without thinking.

Here is a simple Node script you might run as part of a postbuild step to do this correctly:

const zlib = require('zlib');
const fs = require('fs');
const { pipeline } = require('stream');
const { promisify } = require('util');
const pipe = promisify(pipeline);

async function compressFile(filePath) {
  const compress = zlib.createBrotliCompress({
    params: {
      [zlib.constants.BROTLI_PARAM_QUALITY]: 11, // Max effort!
      [zlib.constants.BROTLI_PARAM_MODE]: zlib.constants.BROTLI_MODE_TEXT,
    },
  });
  
  const source = fs.createReadStream(filePath);
  const destination = fs.createWriteStream(`${filePath}.br`);
  
  await pipe(source, compress, destination);
  console.log(`Successfully pre-compressed: ${filePath}`);
}

// Run this on your /dist folder after building

In Nginx, you would then use the brotli_static module to serve these files:

location /static/ {
    brotli_static on; # Nginx looks for .br files first
}

The "1% Gain" Reality Check

We often get obsessed with the "Bytes Sent" metric because it's easy to see in the DevTools Network tab. But users don't care about bytes; they care about when the page becomes interactive.

If you’re running a high-traffic API and you’ve set Brotli to 11, you’re likely increasing your infrastructure costs because you need more CPU cores to handle the same request volume. You’re paying more money to make your app feel slower.

The rule of thumb:
- Dynamic Content: Use Brotli 4. It beats Gzip 6 on almost every metric without the latency hit.
- Static Assets: Use Brotli 11 at build time. Save every byte possible because the "cost" of compression is paid only once by your CI/CD pipeline, not by every user.

Stop letting your compression settings eat your performance gains. Check your middleware, look at your Nginx configs, and if you see an 11 in a dynamic context, change it. Your server (and your users) will thank you.