Building DynamoDB Streams: Capture Item Changes in Real Time (Vol. 2 – Streams)
Ready to build? Let’s roll up our sleeves and implement a real-time DynamoDB Stream with the AWS CLI. You’ll capture table updates, send them through a Lambda function, and verify events live — all in one clean workflow.
Step 1: Create a Table with Streams Enabled
You’ll start by creating a UserEvents table with a partition key (UserID) and a sort key (EventID). The stream will capture new and old item images whenever data changes.
aws dynamodb create-table \
  --table-name UserEvents \
  --attribute-definitions \
    AttributeName=UserID,AttributeType=S \
    AttributeName=EventID,AttributeType=S \
  --key-schema \
    AttributeName=UserID,KeyType=HASH \
    AttributeName=EventID,KeyType=RANGE \
  --billing-mode PAY_PER_REQUEST \
  --stream-specification StreamEnabled=true,StreamViewType=NEW_AND_OLD_IMAGES
Output:
{
  "TableDescription": {
    "TableName": "UserEvents",
    "TableStatus": "CREATING",
    "StreamSpecification": {
      "StreamEnabled": true,
      "StreamViewType": "NEW_AND_OLD_IMAGES"
    }
  }
}
Check status until it’s active:
aws dynamodb describe-table --table-name UserEvents --query "Table.TableStatus"
Output:
"ACTIVE"
Retrieve the Stream ARN for later:
aws dynamodb describe-table --table-name UserEvents --query "Table.LatestStreamArn" --output text
Example Output:
arn:aws:dynamodb:us-east-1:123456789012:table/UserEvents/stream/2025-10-20T21:30:41.123
💡 Note: Copy this Stream ARN — you’ll use it in Step 3.
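If you're scripting the rest of this walkthrough, you can also capture the ARN in a shell variable instead of copying it by hand:
STREAM_ARN=$(aws dynamodb describe-table --table-name UserEvents --query "Table.LatestStreamArn" --output text)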
Step 2: Create a Lambda Function to Process Events
This Lambda function will process DynamoDB Stream events and print them to CloudWatch Logs.
Create a file named lambda_function.py locally:
import json

def handler(event, context):
    # Log the full stream event so you can inspect it in CloudWatch Logs.
    print("Received event:")
    print(json.dumps(event, indent=2))
    return {"statusCode": 200}
Zip it:
zip function.zip lambda_function.py
Create an IAM role for Lambda:
aws iam create-role \
  --role-name DynamoStreamRole \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": {"Service": "lambda.amazonaws.com"},
      "Action": "sts:AssumeRole"
    }]
  }'
Attach permissions for DynamoDB Streams and CloudWatch:
aws iam attach-role-policy \
  --role-name DynamoStreamRole \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaDynamoDBExecutionRole
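If you'd rather grant a tightly scoped policy instead of the managed one, it needs roughly the following actions. This is a sketch; in production you'd narrow Resource to your own stream and log group ARNs:
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": [
      "dynamodb:DescribeStream",
      "dynamodb:GetRecords",
      "dynamodb:GetShardIterator",
      "dynamodb:ListStreams",
      "logs:CreateLogGroup",
      "logs:CreateLogStream",
      "logs:PutLogEvents"
    ],
    "Resource": "*"
  }]
}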
Now create the Lambda function.
Note: Replace 123456789012 with your actual AWS account ID in the --role ARN. If the call fails with an error saying the role cannot be assumed by Lambda, wait a few seconds and retry; newly created IAM roles can take a moment to propagate.
aws lambda create-function \
  --function-name StreamLogger \
  --runtime python3.12 \
  --zip-file fileb://function.zip \
  --handler lambda_function.handler \
  --role arn:aws:iam::123456789012:role/DynamoStreamRole
Output:
{
  "FunctionName": "StreamLogger",
  "Runtime": "python3.12",
  "Handler": "lambda_function.handler",
  "State": "Active"
}
Step 3: Link the Stream to the Lambda Function
Next, link the DynamoDB Stream to your Lambda function. Use the Stream ARN you retrieved in Step 1.
aws lambda create-event-source-mapping \
  --function-name StreamLogger \
  --event-source-arn arn:aws:dynamodb:us-east-1:123456789012:table/UserEvents/stream/2025-10-20T21:30:41.123 \
  --starting-position LATEST
Output:
{
  "UUID": "abcd1234-ef56-7890-ab12-34cd56ef7890",
  "State": "Creating",
  "FunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:StreamLogger"
}
Check status until mapping shows Enabled:
aws lambda list-event-source-mappings --function-name StreamLogger --query "EventSourceMappings[].State"
Output:
["Enabled"]
💡 Tip: Save the UUID above — you’ll need it if you want to pause or re-enable this mapping later.
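By default the mapping delivers every change on the table. If your function only cares about certain events, Lambda's event filtering can discard the rest before invocation. Here's a sketch that keeps only INSERT records; you could also pass the same --filter-criteria flag to update-event-source-mapping on the mapping you just created:
aws lambda create-event-source-mapping \
  --function-name StreamLogger \
  --event-source-arn <your-stream-arn> \
  --starting-position LATEST \
  --filter-criteria '{"Filters": [{"Pattern": "{\"eventName\": [\"INSERT\"]}"}]}'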
Step 4: Insert and Update Data to Trigger the Stream
Insert, update, and delete items to generate stream events.
aws dynamodb put-item \
  --table-name UserEvents \
  --item '{"UserID": {"S": "USER#1001"}, "EventID": {"S": "LOGIN#001"}, "Device": {"S": "Chrome"}}'

aws dynamodb update-item \
  --table-name UserEvents \
  --key '{"UserID": {"S": "USER#1001"}, "EventID": {"S": "LOGIN#001"}}' \
  --update-expression "SET Device = :d" \
  --expression-attribute-values '{":d": {"S": "Firefox"}}'

aws dynamodb delete-item \
  --table-name UserEvents \
  --key '{"UserID": {"S": "USER#1001"}, "EventID": {"S": "LOGIN#001"}}'
Step 5: Verify Logs and Review Event Payloads
Fetch and review Lambda logs to see your captured stream events:
aws logs tail /aws/lambda/StreamLogger --since 5m
Output (abbreviated; each record arrives wrapped in a Records array):
{
  "eventID": "1",
  "eventName": "INSERT",
  "dynamodb": {
    "NewImage": {
      "UserID": {"S": "USER#1001"},
      "EventID": {"S": "LOGIN#001"},
      "Device": {"S": "Chrome"}
    }
  }
}
{
  "eventID": "2",
  "eventName": "MODIFY",
  "dynamodb": {
    "OldImage": {"Device": {"S": "Chrome"}},
    "NewImage": {"Device": {"S": "Firefox"}}
  }
}
{
  "eventID": "3",
  "eventName": "REMOVE",
  "dynamodb": {
    "Keys": {
      "UserID": {"S": "USER#1001"},
      "EventID": {"S": "LOGIN#001"}
    }
  }
}
💡 Because you used NEW_AND_OLD_IMAGES, the MODIFY event shows both before and after values — ideal for auditing or synchronization logic.
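If you're building an audit trail on top of these events, a small helper can turn the OldImage/NewImage pair into a change set. A minimal sketch (diff_images is a hypothetical helper, not something used elsewhere in this walkthrough):

def diff_images(old, new):
    # Return the attributes whose values differ between the two images.
    changed = {}
    for key in set(old) | set(new):
        if old.get(key) != new.get(key):
            changed[key] = {"old": old.get(key), "new": new.get(key)}
    return changed

# With the MODIFY event above:
# diff_images({"Device": {"S": "Chrome"}}, {"Device": {"S": "Firefox"}})
# -> {"Device": {"old": {"S": "Chrome"}, "new": {"S": "Firefox"}}}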
Step 6: Clean Up Resources
Delete the Lambda function, IAM role, and DynamoDB table.
aws lambda delete-function --function-name StreamLogger
aws iam delete-role --role-name DynamoStreamRole
aws dynamodb delete-table --table-name UserEvents
Confirm:
aws dynamodb list-tables
Output:
{"TableNames": []}
All resources are gone — stream, table, and mapping included.
Wrap-Up
You’ve built a fully operational event-driven DynamoDB system. Your Lambda now reacts in real time to inserts, updates, and deletions — a cornerstone of modern serverless design.
DynamoDB Streams let you build responsive applications, audit trails, and near-instant data synchronization pipelines without writing polling logic.
Pro Tip #1 — Pausing and Resuming
You can pause and resume processing by toggling the mapping. Lambda checkpoints its position in the stream, so re-enabling picks up where it left off rather than replaying old events:
aws lambda update-event-source-mapping --uuid <your-uuid> --enabled false
aws lambda update-event-source-mapping --uuid <your-uuid> --enabled true
Find the UUID in the output of Step 3 (create-event-source-mapping).
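For a true replay, delete the mapping and recreate it reading from the start of the stream's 24-hour retention window. A sketch (substitute your own UUID and Stream ARN):
aws lambda delete-event-source-mapping --uuid <your-uuid>
aws lambda create-event-source-mapping \
  --function-name StreamLogger \
  --event-source-arn <your-stream-arn> \
  --starting-position TRIM_HORIZON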
Pro Tip #2 — Stream View Types
Choose your stream view based on need:
KEYS_ONLY — smallest payload, for lightweight triggers.
NEW_IMAGE — new state only.
OLD_IMAGE — previous state only.
NEW_AND_OLD_IMAGES — both before and after, best for audits and analytics.
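One caveat worth knowing: the view type of an enabled stream can't be changed in place. You disable the stream and re-enable it with the new type, which also gives you a new Stream ARN. A sketch:
aws dynamodb update-table --table-name UserEvents --stream-specification StreamEnabled=false
aws dynamodb update-table --table-name UserEvents --stream-specification StreamEnabled=true,StreamViewType=NEW_IMAGE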
Your DynamoDB Build Series continues — next, we’ll explore TTL or Conditional Writes to keep your dataset clean and predictable in real-world workflows.
Aaron Rose is a software engineer and technology writer at tech-reader.blog and the author of Think Like a Genius.