The Secret Life of JavaScript: The Async Generator
How to handle streams of data with for await...of.
Timothy was rubbing his temples. On his screen was a function that looked like it had been fighting a losing battle.
async function getAllUsers() {
  let url = '/api/users?page=1';
  const allUsers = [];
  while (url) {
    const response = await fetch(url);
    const data = await response.json();
    // Add this page's users to our big list
    allUsers.push(...data.users);
    // Prepare for the next loop... if there is one
    url = data.nextPage;
  }
  return allUsers;
}
"I'm trying to download all the user data," Timothy explained to Margaret. "But there are 50,000 users. If I wait for all the pages to download before I start processing them, the user waits for 20 seconds. It feels... stuck."
Margaret nodded. "You are treating a Stream like a Bucket," she said.
"You are trying to collect every single drop of water before you let anyone drink," she continued. "Why not let them drink from the hose?"
The Hybrid
Margaret wrote a new syntax on the board. It combined the two most powerful keywords in the language.
async function* fetchUsers() { ... }
"Async meets Generator," she said. "The async allows us to wait for the network. The * allows us to yield data one piece at a time."
She rewrote Timothy's code, adding a safety net.
async function* fetchUsers() {
  let url = '/api/users?page=1';
  while (url) {
    try {
      const response = await fetch(url);
      const data = await response.json();
      // Instead of building a massive array, we deliver this page immediately
      for (const user of data.users) {
        yield user;
      }
      url = data.nextPage;
    } catch (error) {
      console.error("Stream interrupted", error);
      return; // Stop the stream safely
    }
  }
}
Timothy looked at the code. "It looks similar," he admitted. "But how do I use it? The data isn't all there yet."
The Magic Loop (for await...of)
"This is where the magic happens," Margaret said. "We need a loop that knows how to wait."
She wrote the consumer code:
// (Run inside an async function, or at the top level of an ES module)
const userStream = fetchUsers();
for await (const user of userStream) {
  console.log("Processing:", user.name);
  // This loop automatically PAUSES while the next page downloads!
}
Timothy watched the console simulation.
- The loop prints 10 users instantly.
- The loop pauses (while the network fetches Page 2).
- The loop wakes up and prints 10 more users.
"The pause is invisible," Timothy whispered.
"Exactly," Margaret said. "The code inside the loop doesn't know it is waiting. It just asks for the next user, and JavaScript handles the pause. You aren't processing a Memory Snapshot; you are processing Time."
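The simulation Timothy watched can be reproduced without a server. In the sketch below, mockFetchPage is a made-up stand-in for fetch plus response.json(): it resolves one page of users after a short fake delay, so the invisible pause actually happens.

```javascript
// mockFetchPage stands in for fetch() + response.json(): it resolves
// one page of users after a short, fake network delay.
function mockFetchPage(page) {
  const pages = {
    1: { users: [{ name: "Ada" }, { name: "Brendan" }], nextPage: 2 },
    2: { users: [{ name: "Grace" }], nextPage: null },
  };
  return new Promise((resolve) => setTimeout(() => resolve(pages[page]), 50));
}

async function* fetchUsers() {
  let page = 1;
  while (page !== null) {
    const data = await mockFetchPage(page); // the consumer pauses here
    for (const user of data.users) {
      yield user; // each user is delivered as soon as its page arrives
    }
    page = data.nextPage;
  }
}

async function run() {
  const names = [];
  for await (const user of fetchUsers()) {
    console.log("Processing:", user.name);
    names.push(user.name);
  }
  return names; // resolves to ["Ada", "Brendan", "Grace"]
}
```

Calling run() prints the first two names almost instantly, pauses while the fake "network" fetches page 2, then prints the third.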
The Emergency Brake
"One last thing," Margaret added, lowering her voice to a whisper. "In the real world, streams can be endless. Sometimes the user navigates away before you are done."
"What do I do?"
"You use an AbortController," she said. "It allows you to cut the hose. Always design your streams so they can be stopped."
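In the browser you would pass controller.signal straight into fetch(url, { signal }). The sketch below is a runnable stand-in: slowPage is a made-up helper that fakes the network with a signal-aware timer, so you can watch the hose being cut without a server.

```javascript
// slowPage fakes one network request. It rejects with an AbortError
// if the signal fires before the fake request completes.
function slowPage(page, signal) {
  return new Promise((resolve, reject) => {
    const abortError = Object.assign(new Error("aborted"), { name: "AbortError" });
    if (signal.aborted) return reject(abortError);
    const timer = setTimeout(
      () => resolve({ users: [{ name: `user-${page}` }] }),
      25
    );
    signal.addEventListener("abort", () => {
      clearTimeout(timer);
      reject(abortError);
    });
  });
}

// An endless stream: only abort() can stop it.
async function* fetchUsers(signal) {
  let page = 1;
  while (true) {
    const data = await slowPage(page, signal); // real code: fetch(url, { signal })
    for (const user of data.users) {
      yield user;
    }
    page += 1;
  }
}

async function run() {
  const controller = new AbortController();
  setTimeout(() => controller.abort(), 90); // e.g. the user navigates away
  const seen = [];
  try {
    for await (const user of fetchUsers(controller.signal)) {
      seen.push(user.name);
    }
  } catch (error) {
    if (error.name !== "AbortError") throw error; // AbortError means a clean stop
  }
  return seen;
}
```

The stream yields a few users, then the abort fires, the pending request rejects, and the loop exits cleanly instead of running forever.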
The Conclusion
Timothy deleted his allUsers array. He didn't need the bucket anymore.
"It feels lighter," Timothy said. "I'm not hoarding data."
"That is the Zen of the Async Generator," Margaret smiled. "Don't carry the weight of the future. Just handle what is in front of you, right now."
Aaron Rose is a software engineer and technology writer at tech-reader.blog and the author of Think Like a Genius.