Azure
3 min read

Automating Azure SQL Database Backups for Dev/Test

Tags: Azure SQL, Automation, DevOps, Database

Azure SQL's built-in automated backups are great for disaster recovery, but sometimes you need more control: weekly snapshots for testing, refreshed copies for development environments, or point-in-time restores on a schedule.

The Database Copy Approach

Azure SQL's database copy feature creates a transactionally consistent replica. Run the statement against the master database of the destination server:

CREATE DATABASE TestDB_Copy AS COPY OF SourceDB;

The statement returns quickly while the copy seeds asynchronously in the background, with minimal performance impact on the source database.
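Because the copy completes asynchronously, a pipeline step may need to wait for it before doing anything else. A minimal polling sketch with the Azure CLI (the resource group, server, and database names are assumptions):

```shell
# Poll until the copied database reports Online (names are illustrative)
while true; do
  status=$(az sql db show \
    --resource-group rg-dev \
    --server sql-dev \
    --name TestDB_Copy \
    --query status -o tsv)
  [ "$status" = "Online" ] && break
  echo "Copy status: $status - waiting..."
  sleep 30
done
echo "Copy complete"
```

The status moves from Creating to Online once seeding finishes; from T-SQL you can also watch progress via the sys.dm_database_copies DMV.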

Automating with Azure DevOps

Create a scheduled pipeline:

trigger: none

schedules:
- cron: "0 3 * * 0"  # 3am every Sunday
  displayName: Weekly DB Copy
  branches:
    include:
    - main

pool:
  vmImage: 'ubuntu-latest'

variables:
  sourceServer: 'sql-prod.database.windows.net'
  targetServer: 'sql-dev.database.windows.net'
  sourceDb: 'ProductionDB'
  targetDb: 'DevDB'  # fixed name, so the rename-to-previous rotation below finds it on the next run

steps:
- task: AzureCLI@2
  displayName: 'Copy Database'
  inputs:
    azureSubscription: 'your-service-connection'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # Delete old copy if exists
      az sql db delete \
        --resource-group rg-dev \
        --server sql-dev \
        --name DevDB_Previous \
        --yes || true

      # Rename current to previous
      az sql db rename \
        --resource-group rg-dev \
        --server sql-dev \
        --name $(targetDb) \
        --new-name DevDB_Previous || true

      # Create new copy
      az sql db copy \
        --resource-group rg-prod \
        --server sql-prod \
        --name $(sourceDb) \
        --dest-resource-group rg-dev \
        --dest-server sql-dev \
        --dest-name $(targetDb) \
        --service-objective S2

Cross-Subscription Copies

If the source and destination are in different subscriptions, `az sql db copy` won't do it from a single CLI context, so use BACPAC export/import instead. One caveat: a BACPAC export is not transactionally consistent, so if writes can occur during the export, export from a fresh database copy rather than the live database:

steps:
- task: AzureCLI@2
  displayName: 'Export BACPAC'
  inputs:
    azureSubscription: 'prod-subscription'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az sql db export \
        --resource-group rg-prod \
        --server sql-prod \
        --name ProductionDB \
        --admin-user sqladmin \
        --admin-password $(sqlPassword) \
        --storage-key $(storageKey) \
        --storage-key-type StorageAccessKey \
        --storage-uri "https://stbackups.blob.core.windows.net/bacpac/prod-$(Build.BuildNumber).bacpac"

- task: AzureCLI@2
  displayName: 'Import BACPAC'
  inputs:
    azureSubscription: 'dev-subscription'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az sql db import \
        --resource-group rg-dev \
        --server sql-dev \
        --name DevDB \
        --admin-user sqladmin \
        --admin-password $(sqlPassword) \
        --storage-key $(storageKey) \
        --storage-key-type StorageAccessKey \
        --storage-uri "https://stbackups.blob.core.windows.net/bacpac/prod-$(Build.BuildNumber).bacpac"
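Both export and import run through Azure's import/export service, which needs network access to the logical SQL servers; if the servers block public access, the jobs fail. One common (if broad) unblock is the special 0.0.0.0 firewall rule that allows Azure services, sketched here with assumed resource names:

```shell
# Allow Azure services to reach the server
# (0.0.0.0-0.0.0.0 is the special "Allow Azure services" rule, not a real IP range)
az sql server firewall-rule create \
  --resource-group rg-prod \
  --server sql-prod \
  --name AllowAzureServices \
  --start-ip-address 0.0.0.0 \
  --end-ip-address 0.0.0.0
```

Tighter alternatives are service endpoints or private endpoints, at the cost of more setup.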

Data Masking for Non-Production

Don't copy production data without masking sensitive information:

-- Run after copy in dev
UPDATE Customers SET
  Email = CONCAT('user', Id, '@test.local'),
  Phone = '0000000000',
  Address = 'Test Address';

UPDATE Orders SET
  CreditCardLast4 = '0000';

Or use Azure SQL's Dynamic Data Masking for read-time masking.
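Dynamic Data Masking is applied per column with `ALTER TABLE ... ADD MASKED`. A sketch using `sqlcmd` against the dev copy (the server, credentials, and column names are assumptions):

```shell
# Mask columns at read time; non-privileged readers see masked values,
# while db_owner and users granted UNMASK still see the real data
sqlcmd -S sql-dev.database.windows.net -d DevDB -U sqladmin -P "$SQL_PASSWORD" -Q "
ALTER TABLE Customers ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
ALTER TABLE Customers ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'default()');
"
```

Keep in mind DDM only masks query results; the underlying data is still real, so it complements rather than replaces the UPDATE approach above.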

Azure Automation Alternative

For simpler scheduling without DevOps:

# Azure Automation Runbook
param(
  [string]$SourceServer = "sql-prod",
  [string]$SourceDb = "ProductionDB",
  [string]$TargetServer = "sql-dev",
  [string]$TargetDb = "DevDB"
)

Connect-AzAccount -Identity

# The copy fails if the target database already exists, so drop any previous copy first
Remove-AzSqlDatabase -ResourceGroupName "rg-dev" `
  -ServerName $TargetServer -DatabaseName $TargetDb `
  -Force -ErrorAction SilentlyContinue

$copyParams = @{
  ResourceGroupName = "rg-prod"
  ServerName = $SourceServer
  DatabaseName = $SourceDb
  CopyResourceGroupName = "rg-dev"
  CopyServerName = $TargetServer
  CopyDatabaseName = $TargetDb
}

New-AzSqlDatabaseCopy @copyParams

Link the runbook to an Azure Automation schedule (e.g. weekly) so it runs without a pipeline.

Cost Considerations

Database copies incur charges:

  • Compute while copy is running
  • Storage for the new database

For dev/test, consider:

  • Using smaller service objectives (S0/S1 instead of S3)
  • Deleting old copies after use
  • Using serverless compute tier
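Old-copy cleanup can itself be scripted: list databases by naming prefix, compare creation dates against a cutoff, and delete the stale ones. Since ISO-8601 timestamps sort lexicographically, plain string comparison is enough; the resource names in the commented wiring are assumptions:

```shell
# Return success if a copy's creationDate is older than the cutoff
# (ISO-8601 timestamps compare correctly as strings)
is_stale() {
  local created="$1" cutoff="$2"
  [[ "$created" < "$cutoff" ]]
}

# Example wiring (names are illustrative):
# cutoff=$(date -u -d '14 days ago' +%Y-%m-%dT%H:%M:%S)
# az sql db list -g rg-dev -s sql-dev \
#   --query "[?starts_with(name,'DevDB_')].[name,creationDate]" -o tsv |
# while read -r name created; do
#   is_stale "$created" "$cutoff" && az sql db delete -g rg-dev -s sql-dev -n "$name" --yes
# done
```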

Need help automating your database workflows? Get in touch - we help teams build efficient DevOps processes.