The Secret Life of JavaScript: The Compressor

Native compression with the Streams API

#JavaScript #WebPerformance #StreamsAPI #Compression




Margaret is a senior software engineer. Timothy is her junior colleague. They work in a grand Victorian library in London — the kind of place where code quality is the unspoken objective, and craftsmanship is the only thing that matters.

Episode 33

The Heavy Upload

Timothy was staring at the network tab in DevTools, watching an upload progress bar crawl at an agonizing pace. He had built a feature to let users submit their local diagnostic logs back to the server for analysis. But the 15-megabyte plain text files were crippling the application on slow connections.

"Uploading raw text is too slow," Timothy said as Margaret walked up with her morning dark roast. "I'm trying to integrate a third-party compression library to shrink the logs before we send them, but it's freezing the main thread."

The Redundant Payload

Margaret looked at the massive import statement at the top of his file.

"How large is that compression library you are importing?" she asked.

"About 45 kilobytes," Timothy replied. "It's a lot of math, but it's the only way to shrink the payload on the client side."

Margaret took a sip of her coffee. "Think about how our web server delivers our main JavaScript bundle to the browser. Does it send raw text?"

"No," Timothy answered. "The server compresses it with GZIP, and the browser decompresses it on the fly when it arrives."

"Exactly," Margaret said. "So you acknowledge that the browser has a native, deeply integrated implementation of the GZIP algorithm."

Timothy nodded.

"Then why," Margaret asked gently, "are you forcing the user to download a 45-kilobyte JavaScript library just to teach the browser how to do something it already knows how to do?"

Timothy paused, looking back at his code. "I didn't know the browser exposed its internal compression engine to us."

Piping the Stream

Margaret picked up a dry-erase marker. "For a long time, it didn't. But you've spent the last few weeks mastering the Streams API. You learned how to read incoming streams packet by packet. Now, it is time to look at the other direction."

She wrote CompressionStream and DecompressionStream on the whiteboard.

"Because streams operate in continuous chunks," Margaret explained, "we don't have to compress the entire 15-megabyte file at once. We can take our raw data stream, pipe it directly through the browser's native compression engine (the API supports the gzip, deflate, and deflate-raw formats) and funnel the compressed output straight into our fetch request. And if you ever need to manually handle compressed responses from the server, you can pipe them right back through the decompression stream."

"Without blocking the main thread?" Timothy asked.

"Completely asynchronously," she confirmed. "The browser handles the heavy lifting in the background, using highly optimized native code instead of a bulky JavaScript implementation."
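Margaret's round trip through both streams can be sketched in a few lines. This is a hypothetical helper, not code from the episode; it assumes a runtime that exposes the Compression Streams API (modern browsers, or Node 18+ where these are globals):

```javascript
// Round-trip sketch: compress a string with CompressionStream,
// then recover it with DecompressionStream.
async function roundTrip(text) {
  // Compress: string -> byte stream -> gzip stream
  const compressed = new Blob([text]).stream()
    .pipeThrough(new CompressionStream('gzip'));
  // Collect the compressed bytes using the Response helper
  const compressedBytes = await new Response(compressed).arrayBuffer();

  // Decompress: gzip bytes -> byte stream -> original text
  const restored = new Blob([compressedBytes]).stream()
    .pipeThrough(new DecompressionStream('gzip'));
  return new Response(restored).text();
}
```

The `new Response(stream)` trick is just a convenient way to drain a stream into an ArrayBuffer or string without writing a manual reader loop.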

The Native Advantage

Following Margaret's lead, Timothy deleted the heavy third-party library. He took his raw text payload, converted it into a readable stream, and used the pipeThrough() method to seamlessly pass the data through the native compressor, adding a try/catch block for production-grade error handling.

// main.js - Compressing Uploads Natively
async function uploadDiagnosticLogs(logText) {
  // 1. Convert the massive raw string into a stream of bytes
  const rawStream = new Blob([logText]).stream();

  // 2. Pipe the raw stream through the browser's native GZIP engine
  const compressedStream = rawStream.pipeThrough(new CompressionStream('gzip'));

  try {
    // 3. Send the compressed stream directly to the server.
    // Note: streaming request bodies require HTTPS (HTTP/2 or newer in
    // Chromium) and the half-duplex flag below.
    const response = await fetch('/api/upload-logs', {
      method: 'POST',
      body: compressedStream,
      duplex: 'half', // required whenever the body is a ReadableStream
      headers: {
        // Tell the server the incoming payload is compressed
        'Content-Encoding': 'gzip',
        'Content-Type': 'text/plain'
      }
    });
    if (!response.ok) {
      throw new Error(`Server rejected upload: ${response.status}`);
    }
  } catch (error) {
    console.error('Upload failed:', error);
  }
}

Timothy saved the file and throttled his DevTools network connection to simulate a slow 3G cellular network. He clicked the upload button.

There was no UI freeze. There was no massive JavaScript library downloaded. The browser silently piped the raw text through the native stream, compressing the 15 megabytes of data into a tiny fraction of its original size on the fly.
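The size reduction Timothy observed can be checked with a small measurement helper. Both the function and the sample log line below are illustrative, not from the episode; repetitive plain text like diagnostic logs typically compresses extremely well under gzip:

```javascript
// Measure how many bytes a string occupies after native gzip compression.
async function compressedSize(text) {
  const stream = new Blob([text]).stream()
    .pipeThrough(new CompressionStream('gzip'));
  const buffer = await new Response(stream).arrayBuffer();
  return buffer.byteLength;
}

// Illustrative sample: a highly repetitive diagnostic log
const sample = 'ERROR 2024-01-01 connection timeout\n'.repeat(10000);
compressedSize(sample).then(bytes => {
  console.log(`${sample.length} bytes raw -> ${bytes} bytes compressed`);
});
```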

"As long as the backend is configured to accept compressed payloads—which is just a simple middleware flag in frameworks like Express or Fastify—this will work flawlessly," Margaret noted.

The upload finished in a fraction of its former time. By trusting the platform instead of reaching for a library, Timothy had achieved dramatically faster uploads while simultaneously making his JavaScript bundle smaller.


Aaron Rose is a software engineer and technology writer at tech-reader.blog


