The Secret Life of JavaScript: The Batch
Why yield has a cost, and how to optimize your streams.
Timothy sat back, satisfied. His Async Generator was humming.
On his screen, 50,000 user records were streaming in from the API. The memory usage was low. The app felt responsive. He had successfully replaced the "Bucket" with the "Hose."
"It's perfect," Timothy said.
Margaret leaned over his shoulder. She watched the CPU monitor. The fan on Timothy’s laptop was spinning audibly.
"It is functional," Margaret corrected. "But it is exhausting."
"Exhausting?" Timothy pointed to the code. "But I'm streaming! I'm not blocking the main thread!"
"You are streaming," Margaret agreed. "But you are moving a mountain of sand with a teaspoon."
The Cost of a Handshake
Margaret pulled up Timothy's generator code.
// Timothy's "teaspoon" code
for (const user of data.users) {
  yield user; // Timothy is yielding 50,000 times
}
"Every time you type yield," Margaret said, "JavaScript has to pause your function, save its stack frame, bundle up the value, and hand it over to the consumer. Then, the consumer processes it, asks for the next one, and JavaScript has to wake your function up again."
She tapped the desk rhythmically. Pause. Resume. Pause. Resume.
"A single handshake takes microseconds," she explained. "But you are doing that handshake 50,000 times, and because this is an async generator, every single yield also allocates a fresh Promise. That adds up to real, measurable CPU time."
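Margaret's point is easy to measure. Here is a small, self-contained sketch (synchronous generators and fake data, so the names and numbers are illustrative; exact timings vary by engine) comparing one yield per item against one yield per 100-item batch:

```javascript
// Teaspoon: one suspend/resume handshake per item.
function* perItem(items) {
  for (const item of items) yield item;
}

// Shovel: one handshake per batch of `size` items.
function* perBatch(items, size) {
  for (let i = 0; i < items.length; i += size) {
    yield items.slice(i, i + size);
  }
}

const items = Array.from({ length: 50_000 }, (_, i) => i);

let t = performance.now();
let count = 0;
for (const _ of perItem(items)) count++;
const teaspoonMs = performance.now() - t;

t = performance.now();
let batchedCount = 0;
for (const batch of perBatch(items, 100)) batchedCount += batch.length;
const shovelMs = performance.now() - t;

// Both process all 50,000 items; the batched version does 500
// handshakes instead of 50,000.
console.log({ count, batchedCount, teaspoonMs, shovelMs });
```

Run it a few times: the batched loop consistently spends far less time in generator machinery, even before any Promises enter the picture.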
Timothy frowned. "I wonder... what if I yielded the whole page?"
"That's a good idea," Margaret smiled. "Put down the teaspoon. Pick up the shovel."
The Batch
Margaret deleted the inner loop.
async function* fetchUserBatches() {
  let url = '/api/users?page=1';
  while (url) {
    const response = await fetch(url);
    const data = await response.json();

    // THE UPGRADE:
    // Instead of unpacking the box, we hand over the whole box.
    yield data.users;

    url = data.nextPage;
  }
}
"Wait," Timothy said. "Now fetchUserBatches isn't yielding a User anymore. It's yielding an Array of Users."
"Exactly," Margaret said. "If the API gives you 100 users per page, you just reduced your overhead by 100x. You only handshake once per page."
The Consumer
They updated the loop on the other side.
// (inside an async function, or a module with top-level await)
const userStream = fetchUserBatches();

for await (const batch of userStream) {
  console.log(`Received a batch of ${batch.length} users!`);
  // We process the whole shovelful at once
  renderList(batch);
}
Timothy watched the monitor. The CPU fan slowed down. The "stutter" was gone.
Throughput vs. Latency
"There is a trade-off," Margaret warned. "Did you notice?"
Timothy looked closely. "The first user appeared a tiny bit later."
"Correct," Margaret nodded. "In your first version, you used a teaspoon. The first grain of sand arrived instantly, but moving the whole pile took forever."
"In this code version, you used a shovel. You had to wait to fill the shovel (Latency), but you moved the pile much faster (Throughput)."
"This is why frameworks like React batch their updates," she added. "Doing one thing at a time is expensive. Grouping them is efficient."
"So which one do I use?"
"That," Margaret said, "is the job. If you are streaming a chat response, use a teaspoon. If you are processing a database export, use a shovel."
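There is also a middle path, sketched here as a hypothetical rebatch utility: decouple your shovel size from the API's page size, so you can tune throughput without renegotiating with the server.

```javascript
// Hypothetical utility: regroup a batched stream into batches of a
// chosen size, independent of the size of the incoming pages.
async function* rebatch(batches, size) {
  let buffer = [];
  for await (const batch of batches) {
    buffer.push(...batch);
    while (buffer.length >= size) {
      yield buffer.slice(0, size);   // hand over a full shovel
      buffer = buffer.slice(size);
    }
  }
  if (buffer.length > 0) yield buffer; // flush the partial last shovel
}

// Demo with a stand-in source that yields pages of 3:
async function* fakePages() {
  yield [1, 2, 3];
  yield [4, 5, 6];
  yield [7];
}

(async () => {
  for await (const batch of rebatch(fakePages(), 5)) {
    console.log(batch); // [1, 2, 3, 4, 5] then [6, 7]
  }
})();
```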
Timothy looked at his code. It wasn't just working; it was flying.
"Thanks, Margaret," he said.
Aaron Rose is a software engineer and technology writer at tech-reader.blog and the author of Think Like a Genius.