The Secret Life of JavaScript: NDJSON
How to stream complex data with NDJSON
#JavaScript #Frontend #StreamsAPI #NDJSON
Margaret is a senior software engineer. Timothy is her junior colleague. They work in a grand Victorian library in London — the kind of place where code quality is the unspoken objective, and craftsmanship is the only thing that matters.
Episode 31
The Broken Objects
Timothy stared at his console, watching a waterfall of bright red text flood the screen. Every single line read: SyntaxError: Unexpected end of JSON input.
"The Streams API is incredibly fast," Timothy explained to Margaret as she walked up with her morning dark roast coffee. "We completely eliminated the memory buffer for the text logs. But now, I am trying to progressively render a massive payload of 100,000 user records from our database. Because the data is JSON, the network packets are slicing the objects right in half."
Margaret looked at the network inspector.
"Packet one contains half of a user profile," Timothy sighed. "Packet two contains the other half. When the stream processes packet one and attempts to parse it, the JavaScript engine panics because the JSON is incomplete."
The Impossible Boundary
"Why are you trying to parse a fragment, Timothy?" Margaret asked, taking a sip of her coffee.
"Because I want to render the user profiles progressively, exactly like we did with the server logs," Timothy replied. "I don't want the user to stare at a blank screen."
"That is the correct architectural goal," Margaret nodded. "But what defines the boundary of a complete object inside a standard JSON array?"
Timothy thought for a moment. "A comma separating the items. Or a closing curly brace."
"And is it easy or efficient for a stream reader to safely identify the correct closing brace when you are dealing with deeply nested, complex data structures?" Margaret pressed.
Timothy shook his head. "No. It is a parsing nightmare. You would have to count opening and closing brackets on the fly just to know when you had a complete object."
The Delimiter
Margaret uncapped a dry-erase marker and stepped over to the whiteboard. She wrote six letters: NDJSON.
"Welcome to Newline Delimited JSON," Margaret said. "The problem isn't your stream reader; the problem is the format of the data. Standard JSON arrays are meant to be parsed all at once. If you want to stream data, you need a format designed for streaming."
She stepped back and drew a simple visual comparison on the whiteboard:
Standard JSON: [{"id":1},{"id":2},{"id":3}] ← One rigid block, breaks if sliced
NDJSON: {"id":1}\n{"id":2}\n{"id":3}\n ← Safely chunkable at every newline
"This pattern requires the backend to send newline-separated JSON objects instead of a single, monolithic array," Margaret explained. "Instead of sending one massive block, the server formats the response so that every single user profile is a perfectly complete, standalone JSON object placed on its own line."
The Accumulator
"How does that fix the packet slicing?" Timothy asked.
"Because a newline character is a universal, unambiguous boundary," Margaret explained. "You still read the raw stream exactly as it arrives over the network. But instead of trying to parse every chunk immediately, you simply accumulate the incoming text into a small, temporary string buffer. The millisecond you detect a newline character in that buffer, you know with absolute certainty that everything before it is a complete, valid JSON object."
Timothy's eyes lit up as the logic clicked into place. "I slice off that complete string, parse it into an object, and render it to the DOM. Then I keep accumulating the rest of the stream until the next newline appears."
Timothy refactored his logic. Following the accumulator pattern, he implemented a buffer that collected the incoming stream chunks, split them at every \n character, and included a try/finally block for production hygiene.
// main.js - Accumulating chunks until a complete line is found
async function fetchUsers() {
  const response = await fetch('/api/users-ndjson');
  const reader = response.body.getReader();
  const decoder = new TextDecoder('utf-8');
  let buffer = ''; // Temporary accumulator for incomplete chunks

  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;

      // Append the new decoded chunk to the existing buffer
      buffer += decoder.decode(value, { stream: true });

      // Split the buffer by newline characters
      const lines = buffer.split('\n');

      // The last item in the array is an incomplete chunk (no newline yet).
      // Pop it off the array and leave it in the buffer for the next loop.
      buffer = lines.pop();

      for (const line of lines) {
        if (line.trim()) {
          const user = JSON.parse(line); // Safely parse the complete object
          renderUserGrid(user);          // Instantly render to the DOM
        }
      }
    }

    // Flush a final object that arrived without a trailing newline
    if (buffer.trim()) {
      renderUserGrid(JSON.parse(buffer));
    }
  } finally {
    // Always release the reader lock
    reader.releaseLock();
  }
}
He refreshed the dashboard. The loading spinner vanished instantly. The data grid began populating row by row, pulling the user profiles out of the stream and rendering them flawlessly. There were no memory bottlenecks, no blank screens, and absolutely no syntax errors. By respecting the boundaries of the data, Timothy had achieved true progressive rendering for complex objects.
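For readers on modern runtimes, the same accumulator logic can also be packaged as a TransformStream and composed with TextDecoderStream via pipeThrough. This is a sketch under assumptions, not code from the episode: `lineSplitter` is a hypothetical helper, `renderUserGrid` and the endpoint are carried over from the example above, and async iteration over a ReadableStream is not yet supported in every browser (a `getReader()` loop is the portable fallback).

```javascript
// Hypothetical line-splitting TransformStream: same accumulator,
// packaged so it can sit in a pipeThrough() chain.
function lineSplitter() {
  let buffer = '';
  return new TransformStream({
    transform(chunk, controller) {
      buffer += chunk;
      const lines = buffer.split('\n');
      buffer = lines.pop(); // Keep the incomplete tail for the next chunk
      for (const line of lines) {
        if (line.trim()) controller.enqueue(line);
      }
    },
    flush(controller) {
      // Emit a final line that arrived without a trailing newline
      if (buffer.trim()) controller.enqueue(buffer);
    },
  });
}

// Sketch of usage, assuming the article's endpoint and render function:
async function fetchUsersPiped() {
  const response = await fetch('/api/users-ndjson');
  const lines = response.body
    .pipeThrough(new TextDecoderStream())
    .pipeThrough(lineSplitter());
  for await (const line of lines) {
    renderUserGrid(JSON.parse(line));
  }
}
```

The behavior is identical; the design choice is simply where the buffer lives — inside the read loop, or inside a reusable transform.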
Aaron Rose is a software engineer and technology writer at tech-reader.blog.