Solve: Testing Azure Blob Storage in LocalStack—A Complete Guide


Introduction


LocalStack has become an invaluable tool for developers who want to test cloud services locally without incurring costs or dealing with complex cloud setups. While LocalStack's AWS emulation is mature and well-documented, its Azure support is still in alpha—which means it's functional but comes with some important caveats.

In this guide, we'll walk through testing Azure Blob Storage functionality using LocalStack, and more importantly, we'll show you what to do when things don't work as expected. You'll learn how to set up a local testing environment, attempt to use Terraform with Azure providers, and implement a practical workaround when the official approach hits roadblocks.

What you'll learn:
  • How to configure LocalStack for Azure Blob Storage emulation
  • Why the Azure Terraform provider doesn't work with LocalStack (yet)
  • A practical S3-based workaround that simulates Azure Blob functionality
  • Best practices for local cloud development and testing
Let's dive in.


Prerequisites

Before we start, make sure you have:
  • Docker installed and running
  • Python 3.x with pip
  • Basic familiarity with Terraform
  • A Unix-like terminal (Linux, macOS, or WSL)

Step-by-Step Implementation

Step 1: Create a Python Virtual Environment

First, let's set up an isolated Python environment for our testing: 

bash
python3 -m venv ~/envs/azure-localstack 
source ~/envs/azure-localstack/bin/activate

This creates an isolated environment that won't interfere with your system Python packages—a crucial practice for any development work.
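
You can confirm the environment is active by checking which interpreter is now on your path:

bash
which python

This should point at ~/envs/azure-localstack/bin/python rather than your system Python.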

Step 2: Install Required Python Packages

Install the packages we'll need to interact with LocalStack's blob backend: 

bash
pip install boto3 requests

These libraries will allow us to programmatically interact with the storage services.
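
A quick one-liner confirms both packages installed correctly and shows their versions:

bash
python -c "import boto3, requests; print(boto3.__version__, requests.__version__)"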

Step 3: Start LocalStack with Azure and S3 Services

Launch LocalStack with both Azure and S3 services enabled: 

bash
docker run -it -p 4566:4566 -e SERVICES=azure,s3 localstack/localstack

Important: Keep this terminal open—LocalStack needs to stay running throughout our testing process.
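
If you'd rather not dedicate a terminal to it, an optional alternative is to run the container in the background with a name so it's easy to manage later (the name localstack-azure is just a label chosen here):

bash
docker run -d --name localstack-azure -p 4566:4566 -e SERVICES=azure,s3 localstack/localstack
docker logs -f localstack-azure     # follow the startup logs
docker stop localstack-azure        # stop it when you're finished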

Step 4: Verify LocalStack is Running

Open a new terminal and activate your Python environment: 

bash
source ~/envs/azure-localstack/bin/activate

Then check that LocalStack is healthy: 

bash
curl http://localhost:4566/_localstack/health

You should see output confirming that both s3 and azure services are running.
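
Since we installed requests earlier, you can run the same check from Python, which is useful if you want automated tests to wait for LocalStack before running. A minimal sketch (the exact status strings in the health payload can vary between LocalStack versions):

python
import requests

# Query LocalStack's health endpoint and inspect the reported services
resp = requests.get("http://localhost:4566/_localstack/health")
resp.raise_for_status()
services = resp.json().get("services", {})
print(services)

if services.get("s3") not in ("available", "running"):
    raise SystemExit("S3 emulation is not ready yet")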

Step 5: Test Blob Operations with Python

Create a test script to verify blob upload and retrieval functionality: 

bash
nano azure_blob_test.py

Add this Python code: 

python
import boto3

# Configure S3 client to use LocalStack
s3 = boto3.client("s3", endpoint_url="http://localhost:4566",
                  aws_access_key_id="test", aws_secret_access_key="test", region_name="us-east-1")

# Create a container (bucket in S3 terms)
s3.create_bucket(Bucket="azure-test-container")

# Upload a blob (object in S3 terms)
s3.put_object(Bucket="azure-test-container", Key="hello.txt", Body="Hello from Azure test")

# Retrieve and display the blob
resp = s3.get_object(Bucket="azure-test-container", Key="hello.txt")
print(resp["Body"].read().decode())

Run the test: 

bash
python azure_blob_test.py

Expected output: 

bash
Hello from Azure test

This confirms that blob emulation is working through LocalStack's S3 backend.
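
If you want to explore further, the same boto3 client can list what's in the simulated container and clean up when you're done. A small follow-up sketch that reuses the client configuration from the test script:

python
import boto3

s3 = boto3.client("s3", endpoint_url="http://localhost:4566",
                  aws_access_key_id="test", aws_secret_access_key="test", region_name="us-east-1")

# List every blob (object) in the simulated container
for obj in s3.list_objects_v2(Bucket="azure-test-container").get("Contents", []):
    print(obj["Key"], obj["Size"])

# Clean up: the bucket must be empty before it can be deleted
s3.delete_object(Bucket="azure-test-container", Key="hello.txt")
s3.delete_bucket(Bucket="azure-test-container")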

Step 6: Install Compatible Terraform Version

Ensure you have a recent version of Terraform that supports current providers: 

bash
sudo apt-get remove terraform
curl -O https://releases.hashicorp.com/terraform/1.8.5/terraform_1.8.5_linux_amd64.zip
unzip terraform_1.8.5_linux_amd64.zip
sudo mv terraform /usr/local/bin/
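
Confirm the new binary is the one on your path:

bash
terraform version

You should see Terraform v1.8.5 reported.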

Step 7: Attempt Azure Provisioning with Terraform

Now let's try the "official" approach with the Azure provider: 

bash
mkdir ~/azure-localstack-tf && cd ~/azure-localstack-tf
nano main.tf

Create a standard Azure Terraform configuration: 

hcl
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.75"
    }
  }
  required_version = ">= 1.3.0"
}

provider "azurerm" {
  features {}
  skip_provider_registration = true

  client_id       = "test"
  client_secret   = "test"
  tenant_id       = "test"
  subscription_id = "test"
}

resource "azurerm_resource_group" "localstack_rg" {
  name     = "localstack-rg"
  location = "eastus"
}

resource "azurerm_storage_account" "example" {
  name                     = "azurestoragetest"
  resource_group_name      = azurerm_resource_group.localstack_rg.name
  location                 = azurerm_resource_group.localstack_rg.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

resource "azurerm_storage_container" "example" {
  name                  = "testcontainer"
  storage_account_name  = azurerm_storage_account.example.name
  container_access_type = "private"
}

Try to initialize and plan: 

bash
terraform init
terraform plan

❌ You'll encounter this error: 

bash
Error: building account: ... Specified tenant identifier 'test' is neither a valid DNS name...

The Reality Check: The azurerm provider attempts to authenticate with real Azure services and cannot be redirected to LocalStack, despite what some documentation might suggest. This is a limitation of LocalStack's current Azure alpha implementation.

Step 8: Implement the S3 Workaround

Since the direct Azure approach doesn't work, let's create a practical workaround using S3 to simulate Azure Blob functionality: 

bash
mkdir ~/azure-s3-workaround && cd ~/azure-s3-workaround
nano main.tf

Create an S3-based configuration that mimics Azure Blob behavior: 

hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
  required_version = ">= 1.3.0"
}

provider "aws" {
  region                      = "us-east-1"
  access_key                  = "test"
  secret_key                  = "test"
  s3_use_path_style           = true
  skip_credentials_validation = true
  skip_requesting_account_id  = true

  endpoints {
    s3 = "http://localhost:4566"
  }
}

resource "aws_s3_bucket" "blob_container" {
  bucket = "azure-simulated-container"
}

Apply the configuration:

bash
terraform init
terraform apply -auto-approve

✅ Success! You'll see: 

bash
aws_s3_bucket.blob_container: Creating...
aws_s3_bucket.blob_container: Creation complete after 1s [id=azure-simulated-container]

Apply complete! Resources: 1 added, 0 changed, 0 destroyed.
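
To confirm the bucket Terraform just created is visible to application code, you can reuse the boto3 client from Step 5. A quick check (the bucket name matches the Terraform configuration above):

python
import boto3

s3 = boto3.client("s3", endpoint_url="http://localhost:4566",
                  aws_access_key_id="test", aws_secret_access_key="test", region_name="us-east-1")

# The Terraform-provisioned bucket should appear in the listing
print([b["Name"] for b in s3.list_buckets()["Buckets"]])

# Write and read back a test blob in the Terraform-managed "container"
s3.put_object(Bucket="azure-simulated-container", Key="check.txt", Body="provisioned by Terraform")
print(s3.get_object(Bucket="azure-simulated-container", Key="check.txt")["Body"].read().decode())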


Understanding the Workaround

While this approach uses S3 APIs instead of native Azure APIs, it provides several benefits for local development:
  • Functional equivalence: For most blob storage operations, S3 and Azure Blob Storage have similar capabilities (see the adapter sketch after this list)
  • Consistent testing: Your application logic can be tested without cloud dependencies
  • Cost-effective: No charges for local testing and development
  • Rapid iteration: Fast feedback loops during development
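
One way to keep application code insulated from the S3-versus-Azure difference is a thin adapter that exposes Azure-style method names on top of the S3 backend. The sketch below is illustrative only; it is not the azure-storage-blob SDK, and the class and method names are simply chosen to mirror Azure's vocabulary so the real SDK can be swapped in for production:

python
import boto3

class LocalBlobContainerClient:
    """Azure-flavored wrapper over LocalStack's S3 backend (illustrative sketch)."""

    def __init__(self, container_name, endpoint_url="http://localhost:4566"):
        self._bucket = container_name
        self._s3 = boto3.client("s3", endpoint_url=endpoint_url,
                                aws_access_key_id="test", aws_secret_access_key="test",
                                region_name="us-east-1")

    def create_container(self):
        self._s3.create_bucket(Bucket=self._bucket)

    def upload_blob(self, name, data):
        self._s3.put_object(Bucket=self._bucket, Key=name, Body=data)

    def download_blob(self, name):
        return self._s3.get_object(Bucket=self._bucket, Key=name)["Body"].read()

    def list_blobs(self):
        return [o["Key"] for o in self._s3.list_objects_v2(Bucket=self._bucket).get("Contents", [])]

# Usage against the bucket provisioned in Step 8
client = LocalBlobContainerClient("azure-simulated-container")
client.upload_blob("report.txt", "local test data")
print(client.download_blob("report.txt").decode())
print(client.list_blobs())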

Conclusion

LocalStack's Azure support shows promise, but as alpha software, it comes with significant limitations—particularly when integrating with Infrastructure as Code tools like Terraform. The Azure Terraform provider's authentication requirements make it incompatible with LocalStack's current implementation.

However, this doesn't mean local Azure development is impossible. The S3-based workaround we've implemented provides a practical solution for testing blob storage functionality locally. While it's not a perfect 1:1 Azure emulation, it covers the most common use cases and provides a solid foundation for local development workflows.

Key takeaways:
  • LocalStack's Azure support is functional but limited in its current alpha state
  • Direct Terraform integration with Azure providers doesn't work with LocalStack
  • S3-based workarounds can effectively simulate Azure Blob Storage for local testing
  • Always have backup approaches when working with alpha-stage tools

As LocalStack's Azure support matures, we can expect better integration with Azure-native tools. Until then, hybrid approaches like the one demonstrated here provide a pragmatic path forward for teams wanting to incorporate local cloud testing into their development workflows.

For production deployments, remember to switch back to genuine Azure configurations and test thoroughly in staging environments that mirror your production setup.



* * *

Aaron Rose is a software engineer and technology writer.
