Build: Run AWS Commands in a Python Script Using LocalStack


Build a Python Menu for S3 File Actions

If you’ve been playing with LocalStack from the command line, it’s time to level up. Let’s write a Python script that uses Boto3 to talk to your local AWS emulation — all from a friendly little menu. We'll show you how to create this script, where to place it, and how to run it inside your virtual environment.

The script you’ll build lets you upload and download files from a simulated S3 bucket — just like the real thing. You’ll also be able to list files locally and inside your mock S3 environment. This gives you a fully self-contained AWS experience, all running on your local machine.

Here’s what the program looks like once launched: 


Bash
Welcome to the LocalStack S3 File Manager

1. Upload file to S3
2. Download file from S3
3. List local files
4. List files in S3 bucket
5. Exit
Enter choice:   

In this tutorial, you'll install Docker, set up a Python virtual environment, install LocalStack and its CLI tools, create an S3 bucket, and finally run the Python script to move files between your local filesystem and the bucket. Let’s dive in.


Step 1: Install Docker (System-Wide)

LocalStack runs inside Docker, so your first move is to install Docker outside any Python environment.

Update and upgrade your system packages: 

Bash
sudo apt update && sudo apt upgrade -y   

Install Docker: 

Bash
sudo apt install docker.io -y   

Add your user to the Docker group so you can run Docker without sudo: 

Bash
sudo usermod -aG docker $USER   

Reboot or restart your Linux session to activate the new group.


Step 2: Set Up a Python Virtual Environment

This keeps your LocalStack CLI isolated and avoids the externally-managed-environment pip error.

Install the required Python tools: 


Bash
sudo apt install python3-venv python3-pip -y   

Create a virtual environment for LocalStack: 

Bash
python3 -m venv localstack-venv   

Activate the virtual environment: 

Bash
source localstack-venv/bin/activate   

Install LocalStack CLI and supporting tools inside the venv: 

Bash
pip install --no-cache-dir localstack awscli awscli-local boto3   

Confirm the installation: 

Bash
localstack --version   

Check that awslocal installed correctly: 

Bash
awslocal --version   


Step 3: Launch LocalStack in One Terminal

Open your first terminal.

Activate the venv: 

Bash
source ~/localstack-venv/bin/activate   

Start LocalStack in foreground mode to see live logs: 

Bash
localstack start   

The terminal will stay open and stream logs. Wait for the Ready. message.
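If you'd rather check readiness programmatically than watch the logs, LocalStack exposes a health endpoint at /_localstack/health. Here's a minimal sketch using only the Python standard library — note that the exact service-state strings ("available", "running") can vary between LocalStack versions:

```python
import json
import urllib.request

def localstack_ready(url="http://localhost:4566/_localstack/health"):
    """Return True if LocalStack's health endpoint reports S3 as usable."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            data = json.load(resp)
    except OSError:
        return False  # endpoint not reachable yet (or wrong port)
    # LocalStack typically reports each service as "available" or "running"
    return data.get("services", {}).get("s3") in ("available", "running")
```

You could call this in a short retry loop before firing off any awslocal or Boto3 commands.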


Step 4: Run CLI Commands in a Second Terminal

Open a second terminal window.

Activate the venv again: 

Bash
source ~/localstack-venv/bin/activate   

Now create your first local S3 bucket: 

Bash
awslocal s3 mb s3://my-test-bucket   

If you see make_bucket: my-test-bucket, you’re golden.

Look back at Terminal 1—you'll see log output like: 

Bash
AWS s3.CreateBucket => 200   

That's proof your local AWS environment is fully operational.
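The same check works from Boto3, which you'll meet in Step 5. As a sketch (the helper name bucket_names is ours, not part of Boto3), here's a tiny function that pulls bucket names out of a list_buckets() response — pass it the s3 client from the script below:

```python
def bucket_names(s3_client):
    """Return the names of all buckets visible to a boto3-style S3 client."""
    response = s3_client.list_buckets()
    return [bucket["Name"] for bucket in response.get("Buckets", [])]

# With a client pointed at http://localhost:4566 (see Step 5), you'd
# expect "my-test-bucket" to appear in the returned list.
```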


Step 5: Write the Python Script in a Third Terminal

Open a third terminal window.

Activate your venv once more: 

Bash
source ~/localstack-venv/bin/activate   

Use nano or vi to create the script: 

Bash
nano s3_menu.py   

Paste the following Python code:

Python
import boto3
import os

# Connect to the LocalStack S3 endpoint using simulated credentials
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:4566",
    aws_access_key_id="test",
    aws_secret_access_key="test",
    region_name="us-east-1"
)

bucket_name = "my-test-bucket"

# Function to list local files in current directory
def list_local_files():
    print("\nLocal files:")
    for f in os.listdir("."):
        if os.path.isfile(f):
            print(f"- {f}")

# Function to list objects stored in the S3 bucket
def list_s3_files():
    print(f"\nFiles in S3 bucket '{bucket_name}':")
    response = s3.list_objects_v2(Bucket=bucket_name)
    if 'Contents' in response:
        for obj in response['Contents']:
            print(f"- {obj['Key']}")
    else:
        print("(No files found)")

# Upload a file from local disk to the S3 bucket
def upload_file():
    filename = input("Enter filename to upload (from current directory): ")
    if os.path.isfile(filename):
        with open(filename, "rb") as f:
            s3.put_object(Bucket=bucket_name, Key=filename, Body=f)
        print(f"Uploaded {filename} to S3.")
    else:
        print("File does not exist.")

# Download a file from the S3 bucket to local disk
def download_file():
    filename = input("Enter filename to download from S3: ")
    try:
        response = s3.get_object(Bucket=bucket_name, Key=filename)
        with open(filename, "wb") as f:
            f.write(response['Body'].read())
        print(f"Downloaded {filename} from S3.")
    except s3.exceptions.NoSuchKey:
        print("File not found in S3.")

# Main interactive loop
while True:
    print("\nWelcome to the LocalStack S3 File Manager")
    print("1. Upload file to S3")
    print("2. Download file from S3")
    print("3. List local files")
    print("4. List files in S3 bucket")
    print("5. Exit")

    choice = input("Enter choice: ")

    if choice == "1":
        upload_file()
    elif choice == "2":
        download_file()
    elif choice == "3":
        list_local_files()
    elif choice == "4":
        list_s3_files()
    elif choice == "5":
        print("Goodbye!")
        break
    else:
        print("Invalid choice. Try again.")   
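One caveat worth knowing: list_objects_v2 returns at most 1,000 keys per call, so the menu above is fine for a demo bucket but would silently truncate larger ones. For those you'd reach for a paginator. A hedged sketch (the helper name list_all_keys is ours):

```python
def list_all_keys(s3_client, bucket):
    """Collect every key in a bucket, following pagination."""
    paginator = s3_client.get_paginator("list_objects_v2")
    keys = []
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            keys.append(obj["Key"])
    return keys
```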


Step 6: Run the Python Script

Now it's time to run the Python script and try it out:

Bash
python s3_menu.py   


Sample Output from a Real Session

Bash
Welcome to the LocalStack S3 File Manager

1. Upload file to S3
2. Download file from S3
3. List local files
4. List files in S3 bucket
5. Exit
Enter choice: 1
Enter filename to upload (from current directory): hello.txt
Uploaded hello.txt to S3.

Enter choice: 4

Files in S3 bucket 'my-test-bucket':
- hello.txt

Enter choice: 2
Enter filename to download from S3: hello.txt
Downloaded hello.txt from S3.

Enter choice: 3

Local files:
- hello.txt
- s3_menu.py   


Recap: What Are the Three Terminals Doing? 

  • Terminal 1 - Run LocalStack and stream logs
  • Terminal 2 - Use awslocal CLI (e.g., mb)
  • Terminal 3 - Write and run the Python script


Graceful Shutdown and Cleanup

Terminal 3 (Python):
  1. Choose option 5 (Exit) to quit the script if it is still running.
  2. Exit (deactivate) the Python virtual environment:

Bash
deactivate   


Terminal 2 (CLI): 

Remove the S3 bucket and contents:

Bash
awslocal s3 rb s3://my-test-bucket --force   

Exit the terminal session:

Bash
exit   

Quit (deactivate) the Python virtual environment:

Bash
deactivate   


Terminal 1 (LocalStack): 

Press Ctrl+C to stop LocalStack's logs and services.

Quit (deactivate) the Python virtual environment:

Bash
deactivate   


Conclusion

Next time, we can expand this with folders, object metadata, or even logging. But for now? You just took command of a local AWS environment with pure Python. 🚀📃


Need AWS Expertise?

We'd love to help you with your AWS projects.  Feel free to reach out to us at info@pacificw.com.


Written by Aaron Rose, software engineer and technology writer at Tech-Reader.blog.
