The Day My Code Brought Down Black Friday (And How Python Generators Saved My Career)

 


How a memory disaster taught me to think in streams, not batches





It was 6:47 AM on Black Friday. I was three months into my first developer job at ShopSmart, clutching my third cup of coffee and watching our website's traffic dashboard climb toward what we hoped would be record-breaking numbers.

Then my Slack lit up like a Christmas tree.

"Site's down."
"Everything's timing out."
"Revenue dashboard showing zero."
"What the hell is happening?"

My stomach dropped. I had deployed the new product recommendation feature just two days earlier—my first major contribution to the production system. The feature was supposed to show customers "products you might like" based on their purchase history and similar customers' behavior.

The Moment of Truth

Sarah, our senior engineer, was already pulling up server logs when I stumbled over to her desk.

"Memory usage is through the roof," she said, pointing at graphs that looked like rocket ships heading to space. "Our main server is struggling with an out-of-control memory leak."

I watched the logs scroll by, and there it was—my code. Over and over again:

ProductRecommendationService.generate_recommendations()
Loading customer data... 2.1GB
Loading purchase history... 4.7GB
Loading similar customers... 8.2GB
MemoryError: allocation failed

My feature was trying to load data for all 2.8 million customers into memory at once, just to generate recommendations for whoever was currently browsing the site.

"Is this your code, Alex?" Sarah asked gently.
I nodded, feeling like I might throw up.

The Problem I Created

Here's what my "clever" recommendation system was doing:

def create_recommendations_badly():
    # Load ALL customers into memory at once
    all_customers = load_every_customer_from_database()
    all_purchases = load_every_purchase_from_database() 
    all_products = load_every_product_from_database()

    # Now try to process 2.8 million customers
    recommendations = analyze_all_customer_behavior(all_customers, all_purchases, all_products)
    return recommendations

I thought I was being efficient by loading everything once. Instead, I created a memory monster that devoured our server's resources and brought down the entire site on the biggest shopping day of the year.
The irony? Most customers only needed recommendations for 5-10 products. I was loading millions of records to serve a handful of suggestions.

The Rescue Mission

While I sat there contemplating my very short career in tech, Sarah was already fixing my mess.

"Alex, watch this," she said, rewriting my code in real-time. "Instead of loading everything at once, we'll process one customer at a time."

def create_recommendations_smartly(current_customer_id):
    # Process customers one at a time, not all at once
    for customer in find_similar_customers(current_customer_id):
        for purchase in get_customer_purchases(customer):
            if purchase_matches_preferences(purchase, current_customer_id):
                yield create_recommendation(purchase)

"See the yield keyword?" she asked. "This is called a generator. Instead of creating a giant list of all recommendations, it creates them one at a time, only when we actually need them."

She deployed the fix. Within minutes, our memory usage dropped from 30GB to under 2GB. The site came back online. Black Friday was saved.

The Lesson That Changed Everything

Later that afternoon, after the crisis had passed and we'd hit record sales numbers, Sarah sat down with me to explain what had really happened.

"The problem wasn't that you made a mistake," she said. "The problem was how you were thinking about data."

She drew two diagrams on the whiteboard:

My Approach (Bad):
Load ALL data → Process ALL data → Return ALL results

Her Approach (Good):
Load ONE item → Process ONE item → Return ONE result → Repeat

"Think of it like a restaurant," she explained. "You don't cook every possible meal at the start of the day and keep them all warm. My code was trying to pre-cook every item on the menu for every possible customer, even those who weren't there yet. A good chef prepares each dish fresh when ordered. That's what generators do—they create data on demand."
The elegance hit me immediately.

The Real-World Magic

To show me how powerful this approach was, Sarah walked me through a simple example:

def recommend_products_for_customer(customer_id):
    # Find customers with similar buying patterns
    for similar_customer in find_similar_customers(customer_id):

        # Look at what they bought recently  
        for recent_purchase in get_recent_purchases(similar_customer):

            # If this customer hasn't bought it yet, recommend it
            if customer_hasnt_bought(customer_id, recent_purchase):
                yield recent_purchase

# Usage: get recommendations one at a time, as you need them
for recommendation_count, product in enumerate(recommend_products_for_customer(12345), start=1):
    display_recommendation(product)

    # Only need 5 recommendations? Stop here.
    if recommendation_count >= 5:
        break

"See how readable that is?" she asked. "A 12-year-old could understand what's happening. We find similar customers, look at their purchases, and recommend things our customer hasn't bought yet. Simple."
The beauty wasn't just in the memory efficiency—it was in the clarity. The code read like a recipe for solving the problem.
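When all you want is a fixed number of items from a generator, Python's standard library already has a tool for it: `itertools.islice` stops pulling from the generator once it has the count you asked for. A small sketch with a stand-in source (the recommendation functions above are illustrative, so a trivial infinite generator is used here):

```python
from itertools import islice

def recommendations():
    # Stand-in for a real recommendation generator: yields forever
    n = 0
    while True:
        yield f"product-{n}"
        n += 1

# Take only the first 5; the generator never computes a sixth item
top_five = list(islice(recommendations(), 5))
print(top_five)
```

Because the source is lazy, even an "infinite" generator is safe to use this way: work stops the moment the consumer stops asking.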

The Growth That Followed

That Black Friday disaster became the turning point in my career. Not because I learned about generators (though that was important), but because I learned how to think differently about problems.

Before: "How do I get all the data I might need?"
After: "How do I get just the data I actually need, when I need it?"

This mindset shift transformed how I approached every coding challenge. Instead of loading entire datasets, I started thinking in streams. Instead of processing everything upfront, I learned to process on-demand.

Six months later, I rewrote our entire analytics system using the same principles. Instead of daily batch jobs that processed terabytes of data, we had real-time streams that processed individual events as they happened. The system was faster, used 90% less memory, and gave our business team insights in real time instead of waiting until the next day.
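The article doesn't show that analytics rewrite, but the shape of a streaming pipeline like it is simple: each stage is a generator that consumes the previous one, so only one event is in flight at a time. Here's a hedged sketch with made-up event data and stage names:

```python
def parse(lines):
    # Stage 1: turn raw "timestamp,amount" lines into event dicts, one at a time
    for line in lines:
        ts, amount = line.split(",")
        yield {"ts": ts, "amount": float(amount)}

def only_large(events, threshold=100.0):
    # Stage 2: keep only events worth reporting
    for event in events:
        if event["amount"] >= threshold:
            yield event

def running_total(events):
    # Stage 3: emit a running revenue total alongside each event
    total = 0.0
    for event in events:
        total += event["amount"]
        yield event["ts"], total

raw = ["09:00,250.0", "09:01,12.5", "09:02,300.0"]
for ts, total in running_total(only_large(parse(raw))):
    print(ts, total)
```

Chaining the stages this way means no stage ever materializes the full dataset; swap `raw` for a file handle or a network stream and the memory profile stays flat.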

The Lesson for Your Career

Every developer has a "Black Friday moment"—a time when your code breaks something important and teaches you a fundamental lesson about how to think differently.
Mine taught me that the most elegant solutions often use the least resources. That readable code is more valuable than clever code. And that the best way to handle big problems is often to break them into the smallest possible pieces.

The generator pattern isn't just a Python feature—it's a way of thinking that applies to every aspect of software development. Whether you're processing data, handling user requests, or designing system architectures, ask yourself: "Do I really need everything at once, or can I handle this one piece at a time?"

Your future self—and your company's Black Friday sales—will thank you for it.

The Code That Saved the Day

For those interested, here's the exact pattern that rescued our Black Friday and has served me well ever since:

def solve_big_problem_in_small_pieces(big_problem):
    for small_piece in break_problem_into_pieces(big_problem):
        solution_piece = solve_small_piece(small_piece)
        yield solution_piece

# Handle solutions one at a time as they're ready
for solution in solve_big_problem_in_small_pieces(my_challenge):
    apply_solution(solution)

Simple, readable, and scalable. Sometimes the best code is the code that does the least.


Aaron Rose is a software engineer and technology writer at tech-reader.blog and the author of Think Like a Genius.
