If you're using Terraform to manage Azure infrastructure, you need remote state storage. Here's how to set it up properly with Azure Blob Storage.
Why Remote State?
By default, Terraform stores state locally in terraform.tfstate. This is fine for learning, but problematic for real work:
- No collaboration - Multiple people can't work on the same infrastructure
- No locking - Concurrent runs can corrupt state
- No versioning - Accidental deletions are permanent
- Security risk - State contains sensitive data that shouldn't be in git
Azure Blob Storage solves all of these.
Step 1: Create the Storage Account
First, create the storage account and container. I usually do this with the Azure CLI rather than Terraform (a chicken-and-egg problem: Terraform can't store its state in a backend it hasn't created yet):
# Variables
RESOURCE_GROUP="rg-terraform-state"
STORAGE_ACCOUNT="stterraformstate$RANDOM" # must be globally unique, 3-24 lowercase alphanumeric characters
CONTAINER_NAME="tfstate"
LOCATION="uksouth"
# Create resource group
az group create --name $RESOURCE_GROUP --location $LOCATION
# Create storage account
az storage account create \
--resource-group $RESOURCE_GROUP \
--name $STORAGE_ACCOUNT \
--location $LOCATION \
--sku Standard_LRS \
--encryption-services blob \
--min-tls-version TLS1_2
# Create blob container
az storage container create \
--name $CONTAINER_NAME \
--account-name $STORAGE_ACCOUNT
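Storage account names must be globally unique and between 3 and 24 lowercase alphanumeric characters. The $RANDOM suffix above stays within that limit, but a quick sanity check before the az call can save a failed deployment. A small sketch:

```shell
STORAGE_ACCOUNT="stterraformstate$RANDOM"

# Storage account names: 3-24 chars, lowercase letters and digits only.
# "stterraformstate" is 16 chars; $RANDOM adds at most 5 digits, so 21 max.
if [[ ${#STORAGE_ACCOUNT} -ge 3 && ${#STORAGE_ACCOUNT} -le 24 && "$STORAGE_ACCOUNT" =~ ^[a-z0-9]+$ ]]; then
  echo "valid: $STORAGE_ACCOUNT"
else
  echo "invalid: $STORAGE_ACCOUNT" >&2
  exit 1
fi
```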
Step 2: Configure the Backend
In your Terraform configuration, add the backend block:
terraform {
backend "azurerm" {
resource_group_name = "rg-terraform-state"
storage_account_name = "stterraformstateXXXXX" # the account created in Step 1
container_name = "tfstate"
key = "prod.terraform.tfstate"
}
}
The key is the blob name - use something descriptive like prod.terraform.tfstate or network.terraform.tfstate for different state files.
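If you'd rather not hard-code the storage account name (it differs per team, and the $RANDOM suffix makes it unpredictable), Terraform supports partial backend configuration: leave values out of the backend block and supply them at init time. A sketch, assuming the same names as above:

```shell
# The backend block in code can then be as minimal as: backend "azurerm" {}
terraform init \
  -backend-config="resource_group_name=rg-terraform-state" \
  -backend-config="storage_account_name=$STORAGE_ACCOUNT" \
  -backend-config="container_name=tfstate" \
  -backend-config="key=prod.terraform.tfstate"
```

This also keeps environment-specific backend settings out of version control.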
Step 3: Authentication
Terraform needs to authenticate to Azure. The best option depends on where it's running:
For Local Development
Use Azure CLI authentication:
az login
Terraform will automatically use your CLI credentials.
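With CLI credentials in place, a minimal provider block is all that's needed. A sketch (note that from azurerm provider v4 onward the subscription ID must be set explicitly, either in the block or via the ARM_SUBSCRIPTION_ID environment variable):

```hcl
provider "azurerm" {
  features {}
  # Picks up Azure CLI credentials automatically after `az login`.
  # From azurerm v4, the subscription must be set explicitly here
  # or via ARM_SUBSCRIPTION_ID:
  # subscription_id = "00000000-0000-0000-0000-000000000000"
}
```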
For CI/CD Pipelines
Use a Service Principal:
az ad sp create-for-rbac --name "sp-terraform" --role contributor \
--scopes /subscriptions/YOUR_SUBSCRIPTION_ID
Then set environment variables:
export ARM_CLIENT_ID="xxx"
export ARM_CLIENT_SECRET="xxx"
export ARM_SUBSCRIPTION_ID="xxx"
export ARM_TENANT_ID="xxx"
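The create-for-rbac output maps onto those variables: appId becomes ARM_CLIENT_ID, password becomes ARM_CLIENT_SECRET, and tenant becomes ARM_TENANT_ID. A sketch that extracts them with jq - the JSON here is a placeholder showing the output shape, not real credentials:

```shell
# Placeholder mirroring the shape of `az ad sp create-for-rbac` output.
# In a real pipeline you'd capture it: SP_JSON=$(az ad sp create-for-rbac ...)
SP_JSON='{"appId":"00000000-0000-0000-0000-000000000001","password":"example-secret","tenant":"00000000-0000-0000-0000-000000000002"}'

export ARM_CLIENT_ID=$(echo "$SP_JSON" | jq -r '.appId')
export ARM_CLIENT_SECRET=$(echo "$SP_JSON" | jq -r '.password')
export ARM_TENANT_ID=$(echo "$SP_JSON" | jq -r '.tenant')
# ARM_SUBSCRIPTION_ID comes from: az account show --query id -o tsv
```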
For Azure DevOps
Use a Service Connection with Workload Identity Federation - no client secrets to store, rotate, or leak.
Step 4: Initialise
Run terraform init to migrate to the remote backend:
terraform init
If you have existing local state, Terraform will ask if you want to copy it to the new backend. Say yes.
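Two init flags are worth knowing when changing backends - one for migrating existing state, one for starting over with a fresh backend configuration:

```shell
# Copy existing local state to the newly configured backend
terraform init -migrate-state

# Or adopt the new backend configuration without migrating,
# discarding any previously saved backend settings
terraform init -reconfigure
```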
Security Recommendations
Enable soft delete - Protects against accidental deletion:
az storage account blob-service-properties update \
--account-name $STORAGE_ACCOUNT \
--enable-delete-retention true \
--delete-retention-days 7
Enable versioning - Keeps history of state changes:
az storage account blob-service-properties update \
--account-name $STORAGE_ACCOUNT \
--enable-versioning true
Restrict access - Use RBAC instead of storage account keys:
# Grant your identity access
az role assignment create \
--role "Storage Blob Data Contributor" \
--assignee YOUR_OBJECT_ID \
--scope /subscriptions/XXX/resourceGroups/rg-terraform-state/providers/Microsoft.Storage/storageAccounts/$STORAGE_ACCOUNT
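With that role assignment in place, you can tell the backend to authenticate to the blob with Azure AD rather than a storage account key - the azurerm backend supports a use_azuread_auth flag for this. A sketch, reusing the backend block from Step 2:

```hcl
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-terraform-state"
    storage_account_name = "stterraformstateXXXXX"
    container_name       = "tfstate"
    key                  = "prod.terraform.tfstate"
    use_azuread_auth     = true # Azure AD auth to the blob, no account key needed
  }
}
```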
Disable public access - Block anonymous blob access (--allow-blob-public-access false on the storage account) and, if possible, restrict network access with Private Endpoints.
State Locking
Azure Blob Storage supports native state locking through blob leases. Terraform handles this automatically - if someone else is running Terraform against the same state, you'll see:
Error: Error locking state: Error acquiring the state lock
This is a feature, not a bug. Wait for the other operation to complete.
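Rather than retrying manually, you can tell Terraform to wait for the lock to be released:

```shell
# Wait up to 5 minutes for the blob lease before giving up
terraform apply -lock-timeout=5m
```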
Multiple Environments
Use different state files for different environments:
# In your prod configuration
terraform {
backend "azurerm" {
key = "prod.terraform.tfstate"
}
}
# In your dev configuration
terraform {
backend "azurerm" {
key = "dev.terraform.tfstate"
}
}
Or use workspaces, though I generally prefer separate state files for clarity.
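If you do go the workspace route, the commands look like this. With the azurerm backend, each non-default workspace gets its own blob - the configured key with an env:&lt;workspace&gt; suffix - which is part of why I find explicit per-environment state files easier to reason about:

```shell
terraform workspace new dev        # create and switch to a "dev" workspace
terraform workspace select default # switch back to the default workspace
terraform workspace list           # show all workspaces; * marks the current one
```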
Need help setting up your Infrastructure as Code pipeline? Get in touch - we help teams adopt Terraform and Azure DevOps.