The Secret Life of JavaScript: Parallel Processing with Web Workers
How to stop freezing your UI when parsing massive JSON
#JavaScript #Coding #Frontend #WebDev
Margaret is a senior software engineer. Timothy is her junior colleague. They work in a grand Victorian library in London — the kind of place where code quality is the unspoken objective, and craftsmanship is the only thing that matters.
Episode 26
The Freeze
Timothy watched the sleek CSS loading spinner twirl smoothly on his dashboard. It was a perfect sixty frames per second. But the moment the network request for the historical analytics data completed, the spinner violently froze mid-rotation. The entire browser tab locked up. For three agonizing seconds, Timothy couldn't scroll, couldn't click, and couldn't even highlight text.
Then, just as suddenly, the data populated the grid and the UI snapped back to life.
"I don't understand," Timothy muttered, aggressively clicking the refresh button to watch the freeze happen again. "We spent the last three days clearing the Main Thread. I used IntersectionObserver and ResizeObserver. I even used async/await for the data fetching so it wouldn't block. Why is my UI completely dead?"
The Async Illusion
Margaret leaned against the edge of his desk, taking a slow sip of her dark roast coffee. She watched the spinner freeze and thaw.
"You fell for the async illusion," Margaret said, pointing to his data fetching logic.
async function fetchAnalytics() {
// The network request is non-blocking...
const response = await fetch('/api/historical-data-50mb');
// ...but the parsing is entirely synchronous
const data = await response.json();
renderGrid(data);
}
"You are absolutely right that fetching the data over the network doesn't block the UI," Margaret explained. "But once that massive, fifty-megabyte payload arrives, the browser has to convert that giant string of text into a usable JavaScript object. That parsing is pure, CPU-bound computation. And the Main Thread can still only do one thing at a time."
"Think of the Main Thread as a single chef in a kitchen," she continued. "Using fetch is like the chef putting a pot of water on the stove—they can walk away and chop vegetables while waiting for it to boil. That is asynchronous. But calling response.json() on fifty megabytes of data is like someone dropping a truckload of fifty thousand carrots in the middle of the kitchen. The chef has to stop everything, pick up a knife, and chop every single one. No food gets cooked, and no CSS spinners get animated, until the chopping is done."
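The cost is easy to see in isolation. This sketch (using a generated payload rather than the real fifty-megabyte response) times a synchronous JSON.parse. Nothing else, no rendering and no event handling, can run on the thread for the duration of that call:

```javascript
// Generate a large-ish payload to stand in for the real response body
const bigArray = Array.from({ length: 200_000 }, (_, i) => ({
  id: i,
  value: Math.random(),
}));
const rawText = JSON.stringify(bigArray);

// JSON.parse is synchronous: the calling thread is blocked until it returns
const start = performance.now();
const parsed = JSON.parse(rawText);
const elapsed = performance.now() - start;

console.log(`Parsed ${(rawText.length / 1e6).toFixed(1)} MB of text in ${elapsed.toFixed(1)} ms`);
```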
The Sous-Chef
Timothy rubbed his temples. "So how do I parse the data without forcing the chef to chop the carrots?"
"You hire a sous-chef and put them in a completely separate room," Margaret smiled. She uncapped a marker and wrote two words on the whiteboard: Web Workers.
"JavaScript is inherently single-threaded, but modern browsers give us an escape hatch. A Web Worker allows you to spin up a completely separate, parallel thread in the background. It has its own memory space and its own execution context. The Main Thread can simply hand the massive raw text payload over to the Worker, go right back to animating your spinner, and wait for the Worker to text back when the data is fully parsed."
The Parallel Thread
Timothy split his code into two files. First, he created the sous-chef in a new file named worker.js, making sure to add error handling just in case the JSON was malformed.
// worker.js - The Sous-Chef (Runs in a separate thread)
self.onmessage = function(event) {
  try {
    // 1. Receive the raw string payload from the Main Thread
    const rawText = event.data;

    // 2. Perform the massive, blocking parsing operation here
    const parsedData = JSON.parse(rawText);

    // 3. Send the neatly parsed object back
    self.postMessage({ success: true, data: parsedData });
  } catch (error) {
    self.postMessage({ success: false, error: error.message });
  }
};
Then, he updated his main application logic to offload the heavy lifting.
// main.js - The Executive Chef (Main Thread)
async function fetchAnalytics() {
  const response = await fetch('/api/historical-data-50mb');

  // Get raw string, do NOT parse it on the Main Thread
  const rawText = await response.text();

  // 1. Hire the sous-chef
  const worker = new Worker('worker.js');

  // 2. Listen for the finished product
  worker.onmessage = function(event) {
    if (event.data.success) {
      renderGrid(event.data.data);
    } else {
      console.error('Parsing failed:', event.data.error);
    }
    // Fire the sous-chef to free up system memory
    worker.terminate();
  };

  // 3. Catch failures in loading or running the worker script itself
  worker.onerror = function(event) {
    console.error('Worker failed:', event.message);
    worker.terminate();
  };

  // 4. Hand the truckload of carrots over to the worker
  worker.postMessage(rawText);
}
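For reuse across the app, the same hand-off can be wrapped in a small promise-based helper. This is a sketch, not part of the episode's code: the parseInBackground name is hypothetical, it assumes the worker.js message shape above, and it falls back to blocking main-thread parsing in environments where Worker is unavailable.

```javascript
// Hypothetical helper wrapping the worker hand-off in a promise
function parseInBackground(rawText) {
  if (typeof Worker === 'undefined') {
    // Fallback: parse synchronously; the UI will block on large payloads
    return Promise.resolve(JSON.parse(rawText));
  }
  return new Promise((resolve, reject) => {
    const worker = new Worker('worker.js');
    worker.onmessage = (event) => {
      worker.terminate();
      if (event.data.success) {
        resolve(event.data.data);
      } else {
        reject(new Error(event.data.error));
      }
    };
    worker.onerror = (err) => {
      worker.terminate();
      reject(err);
    };
    worker.postMessage(rawText);
  });
}
```

With a helper like this, fetchAnalytics shrinks to a fetch, an await on the helper, and a call to renderGrid.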
The Unblocked UI
"Look at the delegation," Margaret said, watching him save both files. "The Main Thread never touches the heavy parsing logic. It acts strictly as an orchestrator, passing messages back and forth. Because the heavy lifting is fully isolated in the background, your UI stays completely responsive."
"For most applications, this is the perfect escape hatch," she added. "Though if your payload ever hits the hundreds of megabytes, you would want to look into streaming JSON parsers. And remember our previous lesson on Transferable Objects—if you ever shift from strings to raw ArrayBuffers, you can pass them to the Worker with zero memory copying."
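That zero-copy hand-off looks like this. A sketch only: the transfer list is the real postMessage API, but a worker that accepts binary payloads is hypothetical and not part of this episode's code.

```javascript
// Transferring an ArrayBuffer moves ownership to the receiving thread
// instead of structured-cloning it; the sender's copy is detached.
function sendBinary(worker, buffer) {
  // The second argument is the transfer list. After this call,
  // buffer.byteLength on this side is 0: the memory moved, it was not copied.
  worker.postMessage(buffer, [buffer]);
}
```

The same signature works on a MessagePort, which shares postMessage(message, transferList) with Worker.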
Timothy hit refresh.
The network request fired. The raw payload arrived. But this time, the sleek CSS loading spinner never dropped a single frame. It twirled flawlessly, keeping the application feeling fast, fluid, and completely responsive while the Worker quietly chewed through the megabytes of text in the background. A few seconds later, the grid populated seamlessly.
The massive data had been parsed, and the Main Thread had barely lifted a finger.
Aaron Rose is a software engineer and technology writer at tech-reader.blog.