
Stop Passing Blobs to SubtleCrypto: Why Your Encryption Logic Needs a TransformStream to Survive

Most developers treat browser-side encryption as a single-shot operation, but for large files, the memory tax of SubtleCrypto will silently crash your user's tab unless you adopt a chunked streaming strategy.

5 min read

Imagine you’ve just built a sleek file-sharing app. A user drops a 2GB 4K video into the browser, your code calls window.crypto.subtle.encrypt(), and for a glorious three seconds, everything looks fine. Then, the browser tab turns into a ghost. The fans on the laptop spin up like a jet engine, and the "Aw, Snap!" error page makes its unwelcome debut.

The culprit isn't your UI or your choice of framework; it’s the fact that you treated a massive file like a single, digestible object. Most Web Crypto tutorials show you how to encrypt a BufferSource containing "Hello World," but they rarely mention that SubtleCrypto is a memory hog: it insists on holding the entire plaintext and the resulting ciphertext in RAM simultaneously. If you're working with anything larger than a few hundred megabytes, you’re playing Russian roulette with your user's browser memory.

The Memory Tax of One-Shot Encryption

When you pass a Blob or an ArrayBuffer to crypto.subtle.encrypt, the browser doesn't stream that data from the disk. It loads the whole thing into memory. If the file is 1GB, your tab is now using 1GB for the original data and *another* 1GB for the encrypted output.

Modern browsers have strict memory limits per tab. On mobile, this limit can be as low as a few hundred MBs. On desktop, you might hit a ceiling around 2GB to 4GB depending on the OS. If you exceed this, the browser process kills the tab.

The solution is to stop treating files as monolithic Blobs and start treating them as a flow of data using TransformStream.

Why SubtleCrypto Makes Streaming Hard

Here is the annoying part: the Web Crypto API does not natively support streaming. There is no encryptChunk() method. If you try to simply chop a file into chunks and encrypt them individually with AES-GCM, you’ll end up with a mess.

AES-GCM (the gold standard for web encryption) relies on a unique Initialization Vector (IV) and produces an authentication tag for every encryption operation. If you encrypt chunks blindly, you have to manage those IVs and tags for every single piece, or the recipient won't be able to decrypt them securely.
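You can see the per-operation tag directly: an AES-GCM ciphertext is always exactly 16 bytes longer than its plaintext, because the authentication tag is appended to the output. A quick sketch (using the global crypto object, available in modern browsers and recent Node versions):

```javascript
// Demonstrates the 16-byte GCM tag overhead: the ciphertext is always
// plaintext length + 16 bytes, because the tag is appended to it.
async function tagOverheadDemo() {
  const key = await crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    false,
    ["encrypt"]
  );
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const plaintext = new Uint8Array(1000);
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    plaintext
  );
  return ciphertext.byteLength - plaintext.byteLength; // 16
}
```

Multiply that 16 bytes by the number of chunks and you get the total size overhead of a chunked scheme; it is tiny in relative terms, but the decryptor has to account for it.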

To fix this, we have to build a custom TransformStream that handles the chunking, IV increments, and data flow.

Building the Stream: A Practical Implementation

Let’s look at how to wrap SubtleCrypto inside a TransformStream. We’ll use a "chunked" approach where each chunk is encrypted independently with an incrementing counter or a random IV.

// A simple helper to increment an IV for GCM (big-endian counter with carry).
// Indexing the Uint8Array directly avoids a subtle bug: a DataView built from
// iv.buffer ignores the view's byteOffset if iv is a slice of a larger buffer.
function incrementIV(iv) {
  for (let i = iv.length - 1; i >= 0; i--) {
    if (iv[i] < 255) {
      iv[i]++;
      break;
    }
    iv[i] = 0; // overflow: carry into the next byte
  }
}

function createEncryptionStream(key, baseIv) {
  let currentIv = new Uint8Array(baseIv);

  return new TransformStream({
    async transform(chunk, controller) {
      // Each chunk is encrypted separately, which adds a 16-byte GCM tag
      // of overhead per chunk. Note that file.stream() yields chunks of
      // arbitrary size; production code should re-chunk to a fixed size
      // so the decryptor can find the record boundaries.
      const encryptedChunk = await window.crypto.subtle.encrypt(
        { name: "AES-GCM", iv: currentIv },
        key,
        chunk
      );
      
      // We need to send the IV or ensure the receiver knows how to increment it
      controller.enqueue(new Uint8Array(encryptedChunk));
      
      // Increment IV to prevent nonce reuse
      incrementIV(currentIv);
    }
  });
}

In this setup, we aren't loading the whole file. We're piping a ReadableStream through our TransformStream.

Putting it All Together

Here is how you actually use that stream with a file input. We use the .pipeThrough() and .pipeTo() methods to keep the memory footprint practically flat, regardless of file size.

async function encryptFile(file, key) {
  const iv = window.crypto.getRandomValues(new Uint8Array(12));
  const fileStream = file.stream();

  const encryptionStream = createEncryptionStream(key, iv);

  // We can pipe the output directly to a file system or a server upload
  const encryptedStream = fileStream.pipeThrough(encryptionStream);

  // Example: consuming the stream. Be warned that .blob() buffers the
  // entire ciphertext in memory, which defeats the purpose for massive
  // files; use a library like StreamSaver.js or the File System Access
  // API to write the stream out incrementally instead.
  const response = new Response(encryptedStream);
  return response.blob();
}
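For browsers that support the File System Access API (currently Chromium-based), the ciphertext can be piped straight to disk so it never accumulates in memory. A minimal sketch; saveEncryptedStream and its parameters are names invented for this example:

```javascript
// Hypothetical helper: stream encrypted output directly to a file on disk
// via the File System Access API, so ciphertext never builds up in RAM.
// showSaveFilePicker() is only available in Chromium-based browsers and
// must be called from a user gesture (e.g. a click handler).
async function saveEncryptedStream(encryptedStream, suggestedName) {
  const handle = await showSaveFilePicker({ suggestedName });
  const writable = await handle.createWritable();
  // pipeTo() drains the stream chunk by chunk and closes the file when done.
  await encryptedStream.pipeTo(writable);
}
```

With this in place, `encryptedStream.pipeTo(writable)` replaces the `.blob()` call entirely, and the memory footprint stays flat no matter how large the file is.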

The "Gotchas" of Chunked Encryption

Streaming isn't a magic wand; it introduces a few complexities you need to account for:

1. Integrity Boundaries: Because AES-GCM adds a 16-byte authentication tag to every chunk, the resulting file will be larger than the original. Your decryption logic needs to know exactly how large those chunks were to slice them back up correctly.
2. Nonce Reuse: Never, ever use the same IV with the same key for different chunks. That’s why we increment the IV in the example above.
3. Chunk Reordering: If you encrypt a 10MB file in 1MB chunks, you get 10 authentication tags, but each tag only authenticates its own chunk; nothing ties the chunks to their positions. If an attacker reorders chunks or splices in a chunk from another file encrypted with the same key, your code needs to detect that the sequence is broken. Passing a "chunk index" as the additionalData parameter of AES-GCM is a smart way to prevent this.
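To make that last point concrete, here is a minimal sketch of binding each chunk to its position via additionalData; chunkIndexBytes, encryptChunkWithIndex, and decryptChunkWithIndex are helper names invented for this example:

```javascript
// Encode a chunk index as 8 big-endian bytes so it can be bound to the
// ciphertext via AES-GCM's additionalData (AAD). The AAD is authenticated
// but not encrypted: if a chunk is decrypted with the wrong index, the
// tag check fails and subtle.decrypt() rejects.
function chunkIndexBytes(index) {
  const buf = new ArrayBuffer(8);
  new DataView(buf).setBigUint64(0, BigInt(index));
  return new Uint8Array(buf);
}

async function encryptChunkWithIndex(key, iv, index, chunk) {
  return crypto.subtle.encrypt(
    { name: "AES-GCM", iv, additionalData: chunkIndexBytes(index) },
    key,
    chunk
  );
}

async function decryptChunkWithIndex(key, iv, index, ciphertext) {
  // Throws an OperationError if the tag, IV, or AAD does not match.
  return crypto.subtle.decrypt(
    { name: "AES-GCM", iv, additionalData: chunkIndexBytes(index) },
    key,
    ciphertext
  );
}
```

A chunk decrypted at the wrong position now fails authentication instead of silently producing valid-looking plaintext in the wrong place.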

Decryption is the Mirror Image

When you decrypt, you follow the same streaming pattern. You read the fixed-size chunks (e.g., 64KB + 16 bytes for the tag), decrypt them one by one, and pipe the result to the user.

async function decryptStream(encryptedStream, key, startingIv) {
  let currentIv = new Uint8Array(startingIv);
  
  // You would need a custom transformer to handle slicing the 
  // encrypted stream into the correct chunk sizes (original chunk + 16 bytes)
  // ... similar TransformStream logic as encryption ...
}
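Concretely, such a transformer might look like the sketch below. It assumes the encryptor produced fixed-size plaintext chunks of CHUNK_SIZE bytes, so every ciphertext record is exactly CHUNK_SIZE + 16 bytes (except possibly the last); CHUNK_SIZE, RECORD_SIZE, and createDecryptionStream are names chosen for this example:

```javascript
// Assumed record layout: fixed-size plaintext chunks of CHUNK_SIZE bytes,
// so each ciphertext record is CHUNK_SIZE + a 16-byte GCM tag.
const CHUNK_SIZE = 64 * 1024;
const RECORD_SIZE = CHUNK_SIZE + 16;

// Same IV-increment helper as on the encryption side, repeated here so
// this sketch runs standalone.
function incrementIV(iv) {
  for (let i = iv.length - 1; i >= 0; i--) {
    if (iv[i] < 255) { iv[i]++; break; }
    iv[i] = 0;
  }
}

function createDecryptionStream(key, baseIv) {
  const currentIv = new Uint8Array(baseIv);
  let buffered = new Uint8Array(0);

  async function decryptRecord(record, controller) {
    const plain = await crypto.subtle.decrypt(
      { name: "AES-GCM", iv: currentIv },
      key,
      record
    );
    controller.enqueue(new Uint8Array(plain));
    incrementIV(currentIv);
  }

  return new TransformStream({
    async transform(chunk, controller) {
      // Incoming chunks arrive in arbitrary sizes; buffer bytes until we
      // have at least one full record, then decrypt record by record.
      const merged = new Uint8Array(buffered.length + chunk.length);
      merged.set(buffered);
      merged.set(chunk, buffered.length);
      buffered = merged;
      while (buffered.length >= RECORD_SIZE) {
        await decryptRecord(buffered.slice(0, RECORD_SIZE), controller);
        buffered = buffered.slice(RECORD_SIZE);
      }
    },
    async flush(controller) {
      // The final record may be shorter than RECORD_SIZE.
      if (buffered.length > 0) {
        await decryptRecord(buffered, controller);
      }
    }
  });
}
```

The buffering in transform() is the key difference from the encryption side: the network or disk will not hand you chunks aligned to your record boundaries, so the transformer has to re-slice the byte stream itself.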

Why This Matters for Your Users

The difference between a "Blob-based" app and a "Stream-based" app is the difference between a tool that crashes and a tool that works. When you use TransformStream, the browser only keeps a tiny fraction of the file in RAM at any given moment. This allows you to encrypt a 10GB file on a device with 2GB of RAM without a hitch.

The Web Crypto API is powerful, but it’s a low-level tool. It gives you the engine, but you have to build the fuel lines yourself. Moving to a streaming architecture isn't just an optimization—it's a requirement for building resilient, production-ready web applications. Stop passing Blobs. Start piping streams.