
Compression Streams Are the Easiest Way to Shrink Your Client-Side Data
Stop sending massive, uncompressed JSON blobs to your API when you can reduce your payload size by 90% using native browser streams.
Have you ever looked at a 5MB JSON payload heading toward your server and felt a sudden pang of guilt for your user's data plan?
We spend so much time optimizing our build bundles and minifying our CSS, yet we often shrug our shoulders when it comes to the actual data our applications send back and forth. Traditionally, if you wanted to compress something in the browser, you had to pull in a chunky library like pako or fflate. They work great, but adding 20KB of JavaScript just to shrink a string feels a bit like buying a second car to help move your first car.
Enter the Compression Streams API. It’s native, it’s fast, and it’s probably already sitting in the browser you’re using right now.
Why bother compressing on the client?
Usually, we let the server handle compression (Gzip or Brotli) for the *responses* it sends to the client. But the reverse—the *request*—is often ignored.
If you're building a tool that saves massive chunks of state, uploads huge logs, or syncs a local-first database to a cloud provider, you're likely sending a lot of repetitive, highly compressible text. JSON is notoriously bloated because it repeats keys over and over. A 10MB JSON blob can often shrink to less than 1MB with standard Gzip.
The One-Liner (Well, Close to It)
The API is built on top of the ReadableStream and WritableStream standards. If you haven't messed with streams before, they can feel a bit intimidating, but the pattern for compression is actually quite elegant.
Here is a utility function to turn any string into a compressed Uint8Array:
async function compressString(str, encoding = 'gzip') {
  const stream = new Blob([str]).stream();
  const compressionStream = new CompressionStream(encoding);
  const compressedStream = stream.pipeThrough(compressionStream);

  // Turn the stream back into a buffer
  const response = new Response(compressedStream);
  const buffer = await response.arrayBuffer();
  return new Uint8Array(buffer);
}
// Usage
const bigData = JSON.stringify({ items: Array(10000).fill({ name: "Testing", id: 123 }) });
const compressed = await compressString(bigData);
console.log(`Original: ${bigData.length} bytes`);
console.log(`Compressed: ${compressed.length} bytes`);

In the example above, we take a string, wrap it in a Blob (which conveniently gives us a stream), pipe it through the CompressionStream, and then use the Response object as a "sink" to gather all those chunks back into a single buffer. It’s clean, it doesn't require third-party dependencies, and it's remarkably fast.
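If the Response trick feels a bit too magical, here is roughly what it is doing under the hood: a hand-rolled "sink" that drains any ReadableStream into a single Uint8Array. (The helper name streamToUint8Array is my own, not part of the API.)

```javascript
// Manually drain a ReadableStream into one contiguous Uint8Array.
// This is the long-hand version of `new Response(stream).arrayBuffer()`.
async function streamToUint8Array(readableStream) {
  const reader = readableStream.getReader();
  const chunks = [];
  let totalLength = 0;

  // Pull chunks until the stream is exhausted.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    totalLength += value.length;
  }

  // Stitch the chunks into a single buffer.
  const result = new Uint8Array(totalLength);
  let offset = 0;
  for (const chunk of chunks) {
    result.set(chunk, offset);
    offset += chunk.length;
  }
  return result;
}
```

The Response shortcut is less code, but the reader loop is handy when you want progress reporting (you know how many bytes have arrived after each read).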
Sending Compressed Data to Your API
You might think, "Great, I have a Uint8Array. How do I send this?"
Most modern servers (like Nginx or Node.js frameworks) know how to handle compressed requests if you tell them what's coming via headers. When you use fetch, you just pass the compressed body and set the Content-Encoding header.
async function uploadData(data) {
  const compressedBody = await compressString(JSON.stringify(data));
  const response = await fetch('/api/upload', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Content-Encoding': 'gzip' // Crucial: tells the server to decompress this
    },
    body: compressedBody
  });
  return response.ok;
}

Wait! A word of caution: Some hosting providers or serverless functions (looking at you, AWS Lambda with certain API Gateways) might automatically decompress the body for you, while others might require you to manually decompress it in your middleware. Always check your infrastructure's behavior before shipping this to production.
Decompressing is Just as Easy
If you’re fetching a massive compressed blob from a server that *doesn't* automatically decompress, or if you're pulling compressed data out of IndexedDB, you can use DecompressionStream.
async function decompressToJSON(compressedUint8Array, encoding = 'gzip') {
  const stream = new Blob([compressedUint8Array]).stream();
  const decompressionStream = new DecompressionStream(encoding);
  const decompressedStream = stream.pipeThrough(decompressionStream);
  const response = new Response(decompressedStream);
  const text = await response.text();
  return JSON.parse(text);
}

When should you avoid this?
I’m a huge fan of this API, but don’t go putting it on every 2KB fetch request.
1. Small Payloads: If your JSON is only 5KB, the overhead of the compression algorithm and the extra CPU cycles on the user's device might actually result in a slower perceived experience.
2. Already Compressed Data: Don't try to Gzip a .jpg or a .zip file. They are already compressed. You’ll just waste CPU cycles to potentially make the file *larger*.
3. Old Browsers: This is a modern API. While Chrome, Firefox, and Safari have supported it for a while, if you are stuck supporting older versions of Edge (pre-Chromium) or Internet Explorer (my condolences), you'll need a polyfill.
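Points 1 and 3 can be folded into a single guard before you compress anything. This is just a sketch: the name shouldCompress and the 10KB cutoff are my own assumptions, and you should tune the threshold against your real payloads.

```javascript
// Arbitrary cutoff: below this, gzip overhead tends to outweigh the savings.
const MIN_COMPRESSIBLE_BYTES = 10 * 1024;

// Only compress when the API exists and the payload is big enough
// to be worth the extra CPU cycles.
function shouldCompress(payloadString) {
  const supported =
    typeof CompressionStream !== 'undefined' &&
    typeof DecompressionStream !== 'undefined';
  return supported && payloadString.length >= MIN_COMPRESSIBLE_BYTES;
}
```

When the guard returns false, just send the plain JSON and omit the Content-Encoding header — the server path stays identical either way.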
The "Stream" in Compression Streams
The real magic happens when you deal with data so large it shouldn't be held in memory at all. Because this is a stream-based API, you can theoretically pipe a file from a user's disk (via File.stream()) directly into a compression stream and then directly into a fetch request without ever loading the whole uncompressed file into RAM.
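Sketched out, that zero-buffer pipeline looks like the code below. The /api/upload endpoint is a placeholder, and note that duplex: 'half' is required by Chromium whenever a fetch body is a stream — streaming request bodies are not yet supported in every browser, so treat this as an experiment rather than a production recipe.

```javascript
// Compress a File/Blob as a stream; the uncompressed bytes never
// sit in memory all at once.
function compressFileStream(file) {
  return file.stream().pipeThrough(new CompressionStream('gzip'));
}

// Pipe the compressed stream straight into the request body.
// Assumes the server at /api/upload accepts gzip-encoded uploads.
async function uploadFileCompressed(file) {
  const response = await fetch('/api/upload', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/octet-stream',
      'Content-Encoding': 'gzip'
    },
    body: compressFileStream(file),
    duplex: 'half' // required for streaming request bodies in Chromium
  });
  return response.ok;
}
```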
That’s the difference between an app that feels "pro" and one that crashes the browser tab because it tried to stringify a 200MB log file.
Give it a try next time you're about to dump a giant object into a POST request. Your users’ data plans (and your server’s ingress costs) will thank you.


