Azure SQL's database copy feature is convenient, but it only works within the same subscription. For cross-subscription copies, you need one of the approaches below.
The Problem
This T-SQL works within a subscription, run against the target server's master database:
CREATE DATABASE DevDB AS COPY OF ProductionDB
But there's no direct way to copy across subscriptions.
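The CLI's copy command draws the same boundary: it reaches across servers and resource groups, but stays within the current subscription. For reference (destination names assumed):

# Same-subscription copy, even across servers and resource groups
az sql db copy \
  --resource-group rg-prod --server sql-prod --name ProductionDB \
  --dest-resource-group rg-staging --dest-server sql-staging --dest-name StagingDB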
Option 1: BACPAC Export/Import
The most common approach: export to blob storage, then import in the target subscription.
Export via Azure CLI
# Export from source subscription
az sql db export \
  --resource-group rg-prod \
  --server sql-prod \
  --name ProductionDB \
  --admin-user sqladmin \
  --admin-password "$SQL_PASSWORD" \
  --storage-key "$STORAGE_KEY" \
  --storage-key-type StorageAccessKey \
  --storage-uri "https://stbackups.blob.core.windows.net/bacpac/prod-$(date +%Y%m%d).bacpac"
Import in Target Subscription
# Import to target subscription
az account set --subscription "dev-subscription"

# az sql db import loads the BACPAC into an existing database,
# so create an empty target first
az sql db create \
  --resource-group rg-dev \
  --server sql-dev \
  --name DevDB

az sql db import \
  --resource-group rg-dev \
  --server sql-dev \
  --name DevDB \
  --admin-user sqladmin \
  --admin-password "$SQL_PASSWORD" \
  --storage-key "$STORAGE_KEY" \
  --storage-key-type StorageAccessKey \
  --storage-uri "https://stbackups.blob.core.windows.net/bacpac/prod-$(date +%Y%m%d).bacpac"
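One detail worth catching: both commands compute the blob name with $(date +%Y%m%d), so a run that straddles midnight would export and import different files. In a script, compute the URI once and reuse it:

# Compute the BACPAC URI once, before switching subscriptions,
# then pass it to both commands via --storage-uri "$BACPAC_URI"
BACPAC_URI="https://stbackups.blob.core.windows.net/bacpac/prod-$(date +%Y%m%d).bacpac"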
Limitations
- Time: export and import can each take hours for large databases
- Consistency: the source database keeps changing during export, so the BACPAC is not guaranteed to be transactionally consistent (a workaround is sketched below)
- Size: BACPAC files for large databases can run to many gigabytes, which makes storage and transfer slow
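The standard workaround for the consistency problem is to export from a short-lived copy, which is transactionally consistent as of the moment the copy completes (names assumed):

# Snapshot the live database, export the snapshot, then clean up
az sql db copy \
  --resource-group rg-prod --server sql-prod \
  --name ProductionDB --dest-name ProductionDB-copy
az sql db export \
  --resource-group rg-prod --server sql-prod --name ProductionDB-copy \
  --admin-user sqladmin --admin-password "$SQL_PASSWORD" \
  --storage-key "$STORAGE_KEY" --storage-key-type StorageAccessKey \
  --storage-uri "https://stbackups.blob.core.windows.net/bacpac/prod-copy.bacpac"
az sql db delete \
  --resource-group rg-prod --server sql-prod --name ProductionDB-copy --yes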
Option 2: Azure Data Factory
For large databases or regular copies, ADF is more robust:
{
    "name": "CopyAzureSqlDatabase",
    "properties": {
        "parameters": {
            "tables": { "type": "Array" }
        },
        "activities": [
            {
                "name": "CopyTables",
                "type": "ForEach",
                "typeProperties": {
                    "items": {
                        "value": "@pipeline().parameters.tables",
                        "type": "Expression"
                    },
                    "activities": [
                        {
                            "name": "CopyTable",
                            "type": "Copy",
                            "typeProperties": {
                                "source": { "type": "AzureSqlSource" },
                                "sink": { "type": "AzureSqlSink" }
                            },
                            "inputs": [
                                {
                                    "referenceName": "SourceSqlDataset",
                                    "type": "DatasetReference"
                                }
                            ],
                            "outputs": [
                                {
                                    "referenceName": "SinkSqlDataset",
                                    "type": "DatasetReference"
                                }
                            ]
                        }
                    ]
                }
            }
        ]
    }
}
Benefits:
- Parallel table copying
- Retry logic
- Progress monitoring
- Can transform data during copy
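Once published, the pipeline can be triggered from the CLI as well. A sketch, assuming the datafactory CLI extension is installed and hypothetical factory and resource group names:

# Run the pipeline with an explicit table list (rg-data and adf-copy are assumptions)
az datafactory pipeline create-run \
  --resource-group rg-data \
  --factory-name adf-copy \
  --name CopyAzureSqlDatabase \
  --parameters '{"tables": [{"name": "Customers"}, {"name": "Payments"}]}'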
Option 3: Database Migration Service
For one-time migrations with minimal downtime:
az dms project task create \
  --resource-group rg-migration \
  --service-name dms-migration \
  --project-name sql-project \
  --name migrate-prod-to-dev \
  --source-connection-json '{"dataSource":"sql-prod.database.windows.net","authentication":"SqlAuthentication","userName":"sqladmin","password":"'"$SQL_PASSWORD"'"}' \
  --target-connection-json '{"dataSource":"sql-dev.database.windows.net","authentication":"SqlAuthentication","userName":"sqladmin","password":"'"$SQL_PASSWORD"'"}' \
  --database-options-json '{"name":"ProductionDB","targetDatabaseName":"DevDB"}'
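The task runs asynchronously, so progress has to be polled. A sketch using the matching show command:

# Poll the migration state (Queued, Running, Succeeded, Failed)
az dms project task show \
  --resource-group rg-migration \
  --service-name dms-migration \
  --project-name sql-project \
  --name migrate-prod-to-dev \
  --query "properties.state" -o tsv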
Option 4: Geo-Replication Failover
A more complex but faster approach for large databases:
- Create a geo-secondary in the target subscription (requires special setup)
- Fail over to the secondary
- Break the replication link
This isn't directly supported but can work with Azure support assistance.
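For reference, the same-subscription version of those three steps looks like this; step 1 is the part that needs the special cross-subscription setup (a sketch with assumed names):

# 1. Create a geo-secondary on another server
az sql db replica create \
  --resource-group rg-prod --server sql-prod --name ProductionDB \
  --partner-resource-group rg-dev --partner-server sql-dev

# 2. Planned failover: promote the secondary to primary
az sql db replica set-primary \
  --resource-group rg-dev --server sql-dev --name ProductionDB

# 3. Break the replication link
az sql db replica delete-link \
  --resource-group rg-dev --server sql-dev --name ProductionDB \
  --partner-resource-group rg-prod --partner-server sql-prod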
Automating with DevOps
Weekly refresh pipeline:
trigger: none

schedules:
- cron: "0 2 * * 0"   # 02:00 UTC every Sunday
  displayName: Weekly DB refresh
  branches:
    include:
    - main
  always: true        # run even if nothing was committed since the last refresh

stages:
- stage: Export
  jobs:
  - job: ExportDB
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: AzureCLI@2
      displayName: 'Export BACPAC'
      inputs:
        azureSubscription: 'prod-subscription'
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          az sql db export \
            --resource-group rg-prod \
            --server sql-prod \
            --name ProductionDB \
            --admin-user sqladmin \
            --admin-password "$(sqlPassword)" \
            --storage-key "$(storageKey)" \
            --storage-key-type StorageAccessKey \
            --storage-uri "https://stbackups.blob.core.windows.net/bacpac/weekly-$(Build.BuildId).bacpac"

- stage: Import
  dependsOn: Export
  jobs:
  - job: ImportDB
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: AzureCLI@2
      displayName: 'Import BACPAC'
      inputs:
        azureSubscription: 'dev-subscription'
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          # Delete the existing copy, then recreate it empty;
          # az sql db import loads the BACPAC into an existing database
          az sql db delete --resource-group rg-dev --server sql-dev --name DevDB --yes || true
          az sql db create --resource-group rg-dev --server sql-dev --name DevDB
          # Import new
          az sql db import \
            --resource-group rg-dev \
            --server sql-dev \
            --name DevDB \
            --admin-user sqladmin \
            --admin-password "$(sqlPassword)" \
            --storage-key "$(storageKey)" \
            --storage-key-type StorageAccessKey \
            --storage-uri "https://stbackups.blob.core.windows.net/bacpac/weekly-$(Build.BuildId).bacpac"
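A quick sanity check at the end of the Import stage confirms the database came online (same names as above):

# Confirm the freshly imported copy reports Online
az sql db show --resource-group rg-dev --server sql-dev --name DevDB --query status -o tsv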
Data Masking for Non-Production
Don't forget to mask sensitive data:
-- Run after import
UPDATE Customers SET
    Email = CONCAT('user', Id, '@test.local'),
    Phone = '0000000000';

UPDATE Payments SET
    CardNumber = 'XXXX-XXXX-XXXX-0000';
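To bake this into the refresh, one option is a final pipeline step that applies the script with sqlcmd. A sketch, where mask.sql is a hypothetical file holding the statements above:

# Apply masking to the freshly imported copy (mask.sql is an assumption)
sqlcmd -S sql-dev.database.windows.net -d DevDB \
  -U sqladmin -P "$SQL_PASSWORD" -i mask.sql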
Need help with database migration and synchronisation? Get in touch - we help organisations manage their data infrastructure.