
A Two-Way Passage for the Fetch Body
Why streaming data to your server has always been harder than receiving it, and the obscure property that finally unlocks bidirectional symmetry.
If you’ve ever pulled a 500MB JSON file from an API using a ReadableStream, you know the satisfaction of watching your memory usage stay flat as a pancake while the data flows through your app. We’ve had response streaming for years. But for a long time, the Fetch API felt like a one-way street. You could stream data *down* from the cloud with ease, but the moment you wanted to stream data *up*—say, a massive log file or a live video feed—the browser insisted on buffering the entire chunk of data into memory before hitting the wire.
It was an architectural ghost that haunted the spec for years. We had ReadableStream and WritableStream, but the fetch() body refused to accept them.
The Buffering Tax
Standard POST requests usually look like this:
const data = { message: "Hello server!" };

await fetch('/api/endpoint', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(data)
});

This is fine for a few kilobytes. But if you try to send a 2GB Blob, the browser has to keep that whole thing in memory. And if you're generating data on the fly, like encrypting a file or compressing a folder, you shouldn't have to wait for the whole process to finish before you start sending bytes.
The Secret Handshake: duplex: 'half'
Chromium finally unlocked request streaming, but they didn't just let us pass a stream to the body and call it a day. Because of some legacy baggage in the HTTP spec regarding how request and response headers interact, you have to explicitly tell the browser that you’re doing something a bit weird.
Enter the duplex property. Without it, your stream-based fetch will probably throw a TypeError.
// Upload streams must carry bytes (Uint8Array chunks), not strings.
const encoder = new TextEncoder();

const stream = new ReadableStream({
  start(controller) {
    controller.enqueue(encoder.encode("Part 1 of the message..."));
    setTimeout(() => {
      controller.enqueue(encoder.encode("Part 2, sent after a delay!"));
      controller.close();
    }, 2000);
  }
});

// The magic happens in the options object
fetch('/upload', {
  method: 'POST',
  body: stream,
  duplex: 'half', // This is the secret sauce
}).then(response => console.log('Upload started!'));

Why 'half'? It signals that the request and response are independent: the browser finishes sending the request body before it fully processes the response body. Full duplex (sending and receiving at the exact same time over one connection) is theoretically possible in HTTP/2, but current browser implementations stick to this "half" model to keep things stable.
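The same duplex requirement shows up when you construct a Request object directly, which makes it easy to poke at outside a network call. A minimal sketch (the URL is a placeholder; this runs in modern Chromium and in Node 18+, where Request, ReadableStream, and TextEncoder are all global):

```javascript
const streamedRequest = new Request('http://example.com/upload', {
  method: 'POST',
  body: new ReadableStream({
    start(controller) {
      controller.enqueue(new TextEncoder().encode('streamed!'));
      controller.close();
    }
  }),
  duplex: 'half', // omit this and the constructor throws a TypeError
});

// The body is kept as a real stream; draining it yields the original chunks.
streamedRequest.text().then(text => console.log(text)); // "streamed!"
```

Note that draining the body (via text(), arrayBuffer(), or reading .body) consumes the stream, so a Request built this way can only be sent or read once.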
A Practical Example: Streaming a Large Generator
Imagine you have a massive amount of data being generated in a loop—maybe you're calculating digits of Pi or reading from a local SQLite database. You don't want to store that in an array. You want to pipe it.
Here is how you can wrap a generator in a ReadableStream and throw it at a server:
function* dataGenerator() {
  for (let i = 0; i < 1000; i++) {
    yield `Record number ${i}\n`;
  }
}

const encoder = new TextEncoder();
const iterator = dataGenerator();

const stream = new ReadableStream({
  // pull() is called whenever the stream's internal queue has room,
  // so we enqueue one chunk per call and let backpressure pace the generator
  // instead of draining it all at once.
  pull(controller) {
    const { value, done } = iterator.next();
    if (done) {
      controller.close();
    } else {
      controller.enqueue(encoder.encode(value));
    }
  }
});
async function uploadLogs() {
  try {
    const response = await fetch('/api/logs', {
      method: 'POST',
      headers: { 'Content-Type': 'text/plain' },
      body: stream,
      duplex: 'half'
    });
    if (response.ok) {
      console.log("Logs streamed successfully.");
    }
  } catch (err) {
    console.error("Streaming failed:", err);
  }
}

Why should you care?
Aside from the "cool factor" of making your network tab look like a futuristic data-pipe, there are three major wins:
1. Memory Efficiency: You can upload files larger than the available RAM on a mobile device.
2. Perceived Performance: The server can start processing the first byte while the client is still generating the last one. If the server finds an error in the first chunk, it can close the connection immediately, saving the user's data plan.
3. Transcoding on the fly: You can take a user's File object, pipe it through a TransformStream (to compress or encrypt it), and pipe the output directly into a fetch.
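That third point is where streams really compose. A sketch of gzip-compressing a Blob on its way out the door, with no intermediate buffer ever holding the full compressed output (the '/api/archive' endpoint is hypothetical; CompressionStream is available in modern browsers and Node 18+):

```javascript
// Turn a Blob into a gzip-compressed byte stream, chunk by chunk.
function compressedBody(blob) {
  return blob.stream().pipeThrough(new CompressionStream('gzip'));
}

// Usage sketch: pipe the compressed stream straight into fetch.
async function uploadCompressed(blob) {
  return fetch('/api/archive', {
    method: 'POST',
    headers: { 'Content-Encoding': 'gzip' },
    body: compressedBody(blob),
    duplex: 'half',
  });
}
```

The same shape works for encryption: swap the CompressionStream for any TransformStream that maps plaintext chunks to ciphertext chunks.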
The "Gotchas" and Browser Support
It’s not all sunshine and rainbows. In Chromium, request streaming currently requires HTTP/2 or HTTP/3. If the connection falls back to HTTP/1.1, the fetch fails outright rather than silently buffering: streaming a request over HTTP/1.1 would require chunked transfer encoding for the body, which many servers and proxies handle poorly.
Also, Safari and Firefox have been slower to adopt this. While it's in the spec, at the time of writing Chromium-based browsers (Chrome, Edge, Brave) are the ones leading the charge. If you’re building a production app, you absolutely need a fallback:
const supportsRequestStreaming = (() => {
  let duplexAccessed = false;
  const hasContentType = new Request('', {
    method: 'POST',
    body: new ReadableStream(),
    // A browser that understands streaming bodies reads this getter.
    get duplex() {
      duplexAccessed = true;
      return 'half';
    },
  }).headers.has('Content-Type');
  // Browsers without support stringify the stream into a plain text body,
  // which sets a Content-Type header; supporting browsers set none.
  return duplexAccessed && !hasContentType;
})();

if (supportsRequestStreaming) {
  // Go wild with streams
} else {
  // Fall back to Blobs or manual chunking
}

Closing the Loop
The web is moving toward a world where data isn't just a collection of "files" we toss back and forth, but a continuous flow. By mastering duplex: 'half', you're moving away from the old-school "wait-then-act" pattern and into a more reactive, efficient way of handling user data.
Go ahead, try piping a 1GB stream of "Hello World" to your local server. Your RAM will thank you.


