How to Enable Cost Management Export to Azure Storage
Azure Cost Management exports enable you to automatically deliver detailed cost and usage data to Azure Storage on a scheduled basis. This guide shows you how to set up automated cost exports for advanced analysis in Excel, Power BI, or custom applications, giving you full control over your billing data.
Overview
While the Azure Portal's Cost Analysis provides powerful visualization tools, many organizations need raw cost data for:
- Custom reporting and dashboards in Power BI or Tableau
- Financial reconciliation with enterprise accounting systems
- Long-term cost trend analysis beyond portal retention limits
- Automated alerting and anomaly detection using custom scripts
- Audit compliance requiring historical cost data archives
- Cross-cloud cost comparison when running multi-cloud environments
Azure Cost Management exports solve this by automatically delivering CSV or Parquet files containing granular cost data to Azure Blob Storage on a daily, weekly, or monthly schedule. You can export actual costs, amortized costs, or usage data, including resource tags and metadata.
This guide covers:
- Creating exports via Azure Portal
- Automating exports with Azure CLI
- Advanced configurations with PowerShell and Bicep
- Integrating exports with downstream systems
Prerequisites
Before you begin, ensure you have:
- Cost Management Contributor or Owner role at the scope where you want to create the export (subscription, resource group, or billing account)
- An existing Azure Storage Account (Standard general-purpose v2 recommended)
- Storage Blob Data Contributor role on the storage account (for export service identity)
- Azure CLI 2.30.0+ installed and authenticated (az login); the az costmanagement commands used later require a CLI extension - see the note after this list
- PowerShell 7+ with Az.CostManagement module (for PowerShell method)
- Basic understanding of Azure scopes (subscription vs resource group vs billing account)
Verify Your Permissions
# Check your Cost Management permissions
az role assignment list \
--assignee $(az account show --query user.name -o tsv) \
--scope "/subscriptions/$(az account show --query id -o tsv)" \
--query "[?contains(roleDefinitionName, 'Cost')].{Role:roleDefinitionName, Scope:scope}" -o table
# Verify storage account access
az storage account show \
--name "your-storage-account-name" \
--query "{Name:name, Location:location, Sku:sku.name}" -o table
Understanding Export Types and Scopes
Before creating an export, understand the different options:
Export Types
- Actual Cost - Charges as they appear on your invoice, including purchases and refunds
- Amortized Cost - Spreads reservation and savings plan costs across usage period
- Usage - Raw consumption metrics without pricing (meters, quantities)
Export Scopes
- Subscription - Costs for a single subscription
- Resource Group - Costs for specific resource group
- Management Group - Aggregated costs across multiple subscriptions
- Billing Account (EA/MCA) - Enterprise-wide costs across all subscriptions
File Formats
- CSV - Human-readable, compatible with Excel and most tools
- Parquet - Compressed columnar format, efficient for large datasets and analytics tools
Method 1: Create Cost Export (Azure Portal)
The Azure Portal provides a user-friendly interface for setting up cost exports.
Step 1: Navigate to Cost Exports
- Sign in to the Azure Portal
- Navigate to Cost Management + Billing
- Select the appropriate scope:
- For subscription scope: Select Subscriptions > Choose subscription > Cost Management > Exports
- For billing account scope: Select Billing scopes > Choose billing account > Cost Management > Exports
- Click + Add to create a new export
Step 2: Configure Export Basics
- Export name: Enter a descriptive name (e.g., monthly-actual-costs-prod)
- Export type: Choose one:
- Actual cost (Usage and Purchases) - Recommended for most use cases
- Amortized cost (Usage and Purchases) - For reservation cost spreading
- Usage Details - Meters and quantities only
- Metric: Select Actual Cost or Amortized Cost
- Export format: Choose CSV or Parquet
- CSV: Better for Excel and general use
- Parquet: Better for large datasets and analytics platforms
Step 3: Configure Storage Destination
- Storage account: Select an existing storage account or create new
- Recommendation: Use a dedicated storage account for cost data
- Location: Same region as majority of your resources to minimize egress costs
- Container: Enter a container name (it will be created if it doesn't exist)
- Example: cost-exports
- Directory: (Optional) Specify a subdirectory path for organization
- Example: production/monthly
Step 4: Configure Schedule and Timeframe
- Run frequency:
- Daily - Export yesterday's costs each day
- Weekly - Export last week's costs every Monday
- Monthly - Export last month's costs on the 1st
- Start date: When to begin exporting
- Timeframe:
- Month to date - Current month's costs (refreshed daily for daily exports)
- Last month - Previous calendar month
- Last 7 days - Rolling 7-day window
- Custom date range - Specific historical period
Recommendation: Use Daily export with Month to date timeframe for near real-time cost tracking.
Step 5: Review and Create
- Review the configuration summary
- Click Create
- The export will run according to the schedule (first run within 24 hours)
Step 6: Verify Export Files
After the first export runs:
- Navigate to your storage account in Azure Portal
- Select Containers > Your export container
- Browse the folder structure:
cost-exports/
└── monthly-actual-costs-prod/
    └── 20250115-20250115/
        └── monthly-actual-costs-prod_12345678-1234-1234-1234-123456789012.csv
- Download and inspect the CSV file
Method 2: Create Cost Export (Azure CLI)
Automate export creation using Azure CLI for infrastructure-as-code workflows.
Step 1: Define Export Variables
# Define variables
SUBSCRIPTION_ID=$(az account show --query id -o tsv)
EXPORT_NAME="daily-actual-costs"
STORAGE_ACCOUNT_NAME="costdatastorage"
CONTAINER_NAME="cost-exports"
RESOURCE_GROUP="cost-management-rg"
# Create resource group if needed
az group create \
--name $RESOURCE_GROUP \
--location eastus
Step 2: Create Storage Account and Container
# Create storage account
az storage account create \
--name $STORAGE_ACCOUNT_NAME \
--resource-group $RESOURCE_GROUP \
--location eastus \
--sku Standard_LRS \
--kind StorageV2 \
--access-tier Hot
# Get storage account ID
STORAGE_ACCOUNT_ID=$(az storage account show \
--name $STORAGE_ACCOUNT_NAME \
--resource-group $RESOURCE_GROUP \
--query id -o tsv)
# Create container (with --auth-mode login this uses your Azure AD identity; it requires Storage Blob Data Contributor on the account)
az storage container create \
--name $CONTAINER_NAME \
--account-name $STORAGE_ACCOUNT_NAME \
--auth-mode login
Step 3: Create Cost Export
# Create daily export of actual costs
az costmanagement export create \
--name $EXPORT_NAME \
--type "ActualCost" \
--scope "/subscriptions/$SUBSCRIPTION_ID" \
--storage-account-id "$STORAGE_ACCOUNT_ID" \
--storage-container "$CONTAINER_NAME" \
--storage-directory "exports/daily" \
--timeframe "MonthToDate" \
--recurrence "Daily" \
--recurrence-period from="2025-01-01T00:00:00Z" to="2025-12-31T23:59:59Z" \
--schedule-status "Active"
# Note: this extension produces CSV output; for Parquet exports see Best Practices (REST API)
Step 4: Verify Export Creation
# List all exports at subscription scope
az costmanagement export list \
--scope "/subscriptions/$SUBSCRIPTION_ID" \
--query "[].{Name:name, Type:type, Status:schedule.status}" -o table
# Show export details
az costmanagement export show \
--name $EXPORT_NAME \
--scope "/subscriptions/$SUBSCRIPTION_ID"
Step 5: Manually Run Export (Optional)
# Trigger an immediate export run without waiting for the schedule.
# If your version of the costmanagement extension lacks a run/execute command,
# call the Exports REST API directly:
az rest --method POST \
--uri "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/providers/Microsoft.CostManagement/exports/$EXPORT_NAME/run?api-version=2023-03-01"
Method 3: Create Cost Export (PowerShell)
Use PowerShell for advanced automation and Windows-based environments.
Step 1: Install Required Modules
# Install Az.CostManagement module
Install-Module -Name Az.CostManagement -Repository PSGallery -Force
# Install Az.Storage for storage account operations
Install-Module -Name Az.Storage -Repository PSGallery -Force
# Connect to Azure
Connect-AzAccount
Step 2: Configure Export Parameters
# Define parameters
$subscriptionId = (Get-AzContext).Subscription.Id
$exportName = "weekly-amortized-costs"
$resourceGroup = "cost-management-rg"
$storageAccountName = "costdatastorage"
$containerName = "cost-exports"
$location = "eastus"
# Create storage account
$storageAccount = New-AzStorageAccount `
-ResourceGroupName $resourceGroup `
-Name $storageAccountName `
-Location $location `
-SkuName Standard_LRS `
-Kind StorageV2
# Create container
$ctx = $storageAccount.Context
New-AzStorageContainer `
-Name $containerName `
-Context $ctx `
-Permission Off
Step 3: Create Export Configuration
# Build export definition
$exportDefinition = @{
Type = "ActualCost"
Timeframe = "WeekToDate"
Format = "Csv"
}
# Build delivery info
$deliveryInfo = @{
Destination = @{
ResourceId = $storageAccount.Id
Container = $containerName
RootFolderPath = "weekly-exports"
}
}
# Build schedule
$schedule = @{
Status = "Active"
Recurrence = "Weekly"
RecurrencePeriod = @{
From = "2025-01-01T00:00:00Z"
To = "2025-12-31T23:59:59Z"
}
}
# Create export
New-AzCostManagementExport `
-Scope "/subscriptions/$subscriptionId" `
-Name $exportName `
-DefinitionType $exportDefinition.Type `
-DefinitionTimeframe $exportDefinition.Timeframe `
-DestinationResourceId $storageAccount.Id `
-DestinationContainer $containerName `
-DestinationRootFolderPath "weekly-exports" `
-Format $exportDefinition.Format `
-ScheduleStatus $schedule.Status `
-ScheduleRecurrence $schedule.Recurrence `
-RecurrencePeriodFrom $schedule.RecurrencePeriod.From `
-RecurrencePeriodTo $schedule.RecurrencePeriod.To
Step 4: Verify and Manage Exports
# List all exports
Get-AzCostManagementExport -Scope "/subscriptions/$subscriptionId"
# Get specific export
$export = Get-AzCostManagementExport `
-Scope "/subscriptions/$subscriptionId" `
-Name $exportName
# Display export configuration
$export | Format-List
# Update export (e.g., change schedule)
Update-AzCostManagementExport `
-Scope "/subscriptions/$subscriptionId" `
-Name $exportName `
-ScheduleStatus "Inactive"
# Delete export
Remove-AzCostManagementExport `
-Scope "/subscriptions/$subscriptionId" `
-Name $exportName
Method 4: Deploy Cost Export (Bicep/ARM)
Use infrastructure-as-code for repeatable export deployments.
Step 1: Create Bicep Template
// cost-export.bicep
targetScope = 'subscription'

param exportName string
param storageAccountId string
param containerName string
param recurrence string = 'Daily'
param format string = 'Csv'
@description('Start date for export schedule')
param recurrencePeriodFrom string = '2025-01-01T00:00:00Z'
@description('End date for export schedule')
param recurrencePeriodTo string = '2025-12-31T23:59:59Z'
resource costExport 'Microsoft.CostManagement/exports@2023-03-01' = {
name: exportName
properties: {
schedule: {
status: 'Active'
recurrence: recurrence
recurrencePeriod: {
from: recurrencePeriodFrom
to: recurrencePeriodTo
}
}
format: format
deliveryInfo: {
destination: {
resourceId: storageAccountId
container: containerName
rootFolderPath: exportName
}
}
definition: {
type: 'ActualCost'
timeframe: 'MonthToDate'
dataSet: {
granularity: 'Daily'
configuration: {
columns: [
'Date'
'ResourceId'
'ResourceType'
'ResourceLocation'
'Tags'
'CostInBillingCurrency'
'BillingCurrency'
]
}
}
}
}
}
output exportId string = costExport.id
output exportName string = costExport.name
Step 2: Create Parameters File
// cost-export.parameters.json
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"exportName": {
"value": "daily-actual-costs-bicep"
},
"storageAccountId": {
"value": "/subscriptions/YOUR-SUB-ID/resourceGroups/cost-management-rg/providers/Microsoft.Storage/storageAccounts/costdatastorage"
},
"containerName": {
"value": "cost-exports"
},
"recurrence": {
"value": "Daily"
},
"format": {
"value": "Csv"
}
}
}
Step 3: Deploy the Template
# Deploy using Azure CLI
az deployment sub create \
--location eastus \
--template-file cost-export.bicep \
--parameters cost-export.parameters.json
# Verify deployment (the deployment name defaults to the Bicep file name, here cost-export)
az deployment sub show \
--name cost-export \
--query "properties.outputs"
Advanced Configurations
Export with Custom Columns
Include specific columns for detailed analysis:
# The costmanagement CLI extension does not accept an inline export definition, so
# create this export through the Exports REST API (the payload mirrors the Bicep
# resource shown in Method 4)
BODY=$(cat <<EOF
{
  "properties": {
    "schedule": {
      "status": "Active",
      "recurrence": "Daily",
      "recurrencePeriod": {
        "from": "2025-01-01T00:00:00Z",
        "to": "2025-12-31T23:59:59Z"
      }
    },
    "format": "Csv",
    "deliveryInfo": {
      "destination": {
        "resourceId": "$STORAGE_ACCOUNT_ID",
        "container": "$CONTAINER_NAME"
      }
    },
    "definition": {
      "type": "ActualCost",
      "timeframe": "MonthToDate",
      "dataSet": {
        "granularity": "Daily",
        "configuration": {
          "columns": [
            "Date",
            "ResourceId",
            "ResourceType",
            "ResourceGroupName",
            "ResourceLocation",
            "MeterCategory",
            "MeterSubCategory",
            "Meter",
            "Tags",
            "UnitOfMeasure",
            "Quantity",
            "EffectivePrice",
            "CostInBillingCurrency",
            "BillingCurrency"
          ]
        }
      }
    }
  }
}
EOF
)
az rest --method PUT \
--uri "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/providers/Microsoft.CostManagement/exports/detailed-cost-export?api-version=2023-03-01" \
--body "$BODY"
Export with Tag Filters
Export costs for specific tagged resources:
# Export only resources carrying the Environment=Production tag.
# The costmanagement CLI extension cannot express dataset filters, so this calls the
# Exports REST API directly. Note: not every exports API version accepts a dataSet
# filter; if the request is rejected, export the full scope and filter on the Tags
# column downstream instead.
BODY=$(cat <<EOF
{
  "properties": {
    "schedule": {
      "status": "Active",
      "recurrence": "Daily",
      "recurrencePeriod": {
        "from": "2025-01-01T00:00:00Z",
        "to": "2025-12-31T23:59:59Z"
      }
    },
    "format": "Csv",
    "deliveryInfo": {
      "destination": {
        "resourceId": "$STORAGE_ACCOUNT_ID",
        "container": "$CONTAINER_NAME"
      }
    },
    "definition": {
      "type": "ActualCost",
      "timeframe": "MonthToDate",
      "dataSet": {
        "granularity": "Daily",
        "filter": {
          "tags": {
            "name": "Environment",
            "operator": "In",
            "values": ["Production"]
          }
        }
      }
    }
  }
}
EOF
)
az rest --method PUT \
--uri "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/providers/Microsoft.CostManagement/exports/production-costs-only?api-version=2023-03-01" \
--body "$BODY"
Multiple Exports for Different Scopes
Create organized export structure:
# Function to create scoped export
create_scoped_export() {
local scope_name=$1
local scope_id=$2
az costmanagement export create \
--name "${scope_name}-daily-costs" \
--type "ActualCost" \
--scope "$scope_id" \
--storage-account-id "$STORAGE_ACCOUNT_ID" \
--storage-container "$CONTAINER_NAME" \
--storage-directory "$scope_name" \
--timeframe "MonthToDate" \
--recurrence "Daily" \
--recurrence-period from="2025-01-01T00:00:00Z" to="2025-12-31T23:59:59Z" \
--schedule-status "Active"
}
# Create exports for different subscriptions
create_scoped_export "production" "/subscriptions/prod-sub-id"
create_scoped_export "development" "/subscriptions/dev-sub-id"
create_scoped_export "testing" "/subscriptions/test-sub-id"
Best Practices
1. Organize Storage Structure
Use a consistent folder hierarchy:
cost-exports/
├── daily/
│ ├── actual/
│ └── amortized/
├── weekly/
│ └── actual/
├── monthly/
│ └── actual/
└── historical/
└── one-time-exports/
2. Set Appropriate Retention Policies
Configure lifecycle management to control storage costs:
# Create lifecycle management rule
az storage account management-policy create \
--account-name $STORAGE_ACCOUNT_NAME \
--resource-group $RESOURCE_GROUP \
--policy '{
"rules": [{
"name": "MoveOldExportsToCool",
"enabled": true,
"type": "Lifecycle",
"definition": {
"filters": {
"blobTypes": ["blockBlob"],
"prefixMatch": ["cost-exports/"]
},
"actions": {
"baseBlob": {
"tierToCool": {"daysAfterModificationGreaterThan": 30},
"tierToArchive": {"daysAfterModificationGreaterThan": 90},
"delete": {"daysAfterModificationGreaterThan": 365}
}
}
}
}]
}'
3. Enable Storage Redundancy for Critical Data
# Update storage account to GRS for geographic redundancy
az storage account update \
--name $STORAGE_ACCOUNT_NAME \
--resource-group $RESOURCE_GROUP \
--sku Standard_GRS
4. Implement Access Controls
Use Azure RBAC and SAS tokens for secure access:
# Grant specific user access to read exports
az role assignment create \
--assignee "[email protected]" \
--role "Storage Blob Data Reader" \
--scope "/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.Storage/storageAccounts/$STORAGE_ACCOUNT_NAME"
# Generate time-limited SAS token for external tool access
az storage container generate-sas \
--account-name $STORAGE_ACCOUNT_NAME \
--name $CONTAINER_NAME \
--permissions rl \
--expiry "2025-02-01T00:00:00Z" \
--output tsv
5. Monitor Export Execution
Create alerts for failed exports:
# Create action group for notifications
az monitor action-group create \
--name "cost-export-alerts" \
--resource-group $RESOURCE_GROUP \
--short-name "CostAlerts" \
--action email FinanceTeam [email protected]
# Create alert rule for export failures
# Note: Cost Management doesn't expose export metrics, so monitor the storage account instead.
# BlobCount is emitted by the blob service sub-resource, and evaluation frequency cannot exceed 1 hour.
az monitor metrics alert create \
--name "export-file-not-created" \
--resource-group $RESOURCE_GROUP \
--scopes "/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.Storage/storageAccounts/$STORAGE_ACCOUNT_NAME/blobServices/default" \
--condition "avg BlobCount < 1" \
--window-size 24h \
--evaluation-frequency 1h \
--action "cost-export-alerts"
6. Document Export Configurations
Maintain an inventory of all exports:
# Generate export inventory report
az costmanagement export list \
--scope "/subscriptions/$SUBSCRIPTION_ID" \
--query "[].{Name:name, Type:definition.type, Recurrence:schedule.recurrence, Status:schedule.status, Container:deliveryInfo.destination.container}" \
--output table > cost-export-inventory.txt
7. Use Parquet for Large Datasets
For exports over 1GB, use Parquet format:
# The costmanagement CLI extension does not expose a format option, so create Parquet
# exports through the Exports REST API instead (same payload shape as in Advanced
# Configurations). Parquet output requires a recent exports API version; if the request
# is rejected, fall back to CSV.
az rest --method PUT \
--uri "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/providers/Microsoft.CostManagement/exports/large-dataset-parquet?api-version=2023-07-01-preview" \
--body '{"properties": {"schedule": {"status": "Active", "recurrence": "Daily", "recurrencePeriod": {"from": "2025-01-01T00:00:00Z", "to": "2025-12-31T23:59:59Z"}}, "format": "Parquet", "deliveryInfo": {"destination": {"resourceId": "'$STORAGE_ACCOUNT_ID'", "container": "'$CONTAINER_NAME'"}}, "definition": {"type": "ActualCost", "timeframe": "MonthToDate", "dataSet": {"granularity": "Daily"}}}}'
Troubleshooting
Issue: Export Not Running on Schedule
Symptoms: Export created but no files appearing in storage
Solution:
- Verify export status is "Active"
- Check recurrence period includes current date
- Verify storage account permissions
- Wait 24-48 hours for first run
# Check export status
az costmanagement export show \
--name $EXPORT_NAME \
--scope "/subscriptions/$SUBSCRIPTION_ID" \
--query "{Status:schedule.status, From:schedule.recurrencePeriod.from, To:schedule.recurrencePeriod.to}"
# Check last run history (via REST API)
az rest --method GET \
--uri "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/providers/Microsoft.CostManagement/exports/$EXPORT_NAME/runHistory?api-version=2023-03-01"
Issue: Access Denied Error
Symptoms: Export creation fails with authorization error
Solution:
- Verify you have "Cost Management Contributor" role
- Check storage account RBAC permissions
- Ensure storage account is in same tenant
# Grant Cost Management access to storage account
EXPORT_PRINCIPAL_ID="00000000-0000-0000-0000-000000000000" # Cost Management service principal
az role assignment create \
--assignee $EXPORT_PRINCIPAL_ID \
--role "Storage Blob Data Contributor" \
--scope "$STORAGE_ACCOUNT_ID"
Issue: Incomplete or Missing Data in Exports
Symptoms: Export files contain partial data or missing resources
Solution:
- Verify export scope matches expected resources
- Check tag filters aren't excluding resources
- Wait for cost data to finalize (can take 24-72 hours)
- Review column configuration
# Verify export definition
az costmanagement export show \
--name $EXPORT_NAME \
--scope "/subscriptions/$SUBSCRIPTION_ID" \
--query "definition"
Issue: Storage Costs Growing Unexpectedly
Symptoms: Storage account costs increasing rapidly
Solution:
- Implement lifecycle management policies
- Delete old exports no longer needed
- Use Archive tier for long-term retention
- Consider Parquet format for compression
# Check storage usage
az storage blob list \
--account-name $STORAGE_ACCOUNT_NAME \
--container-name $CONTAINER_NAME \
--query "[].{Name:name, Size:properties.contentLength}" \
--output table
# Calculate total size
az storage blob list \
--account-name $STORAGE_ACCOUNT_NAME \
--container-name $CONTAINER_NAME \
--query "sum([].properties.contentLength)" \
--output tsv
Issue: Export Files Not Readable in Excel
Symptoms: CSV file won't open or displays incorrectly
Solution:
- Check file encoding (should be UTF-8)
- Use Power Query in Excel for large files
- Split large exports by date range
- Use specialized tools (Power BI, Azure Data Explorer) for files >100MB
# Download export file
az storage blob download \
--account-name $STORAGE_ACCOUNT_NAME \
--container-name $CONTAINER_NAME \
--name "cost-exports/export-file.csv" \
--file "local-export.csv"
# Check file size
ls -lh local-export.csv
Next Steps
After setting up cost exports:
- Integrate with Power BI: Create automated dashboards from export data
- Connect Power BI to Azure Blob Storage
- Set up scheduled refreshes
- Build cost trend visualizations
- Automate Cost Analysis: Build custom scripts to analyze exports
- Python/pandas for data manipulation
- Azure Functions for automated processing
- Anomaly detection for unusual cost patterns
- Set Up Cost Alerts: Use exports to trigger custom alerts
- Enable Cost Allocation: Distribute shared costs to teams
- Archive Historical Data: Implement long-term cost data retention
- Move exports to Archive tier after 90 days
- Implement automated backup to secondary storage
- Consider Azure Data Lake for petabyte-scale analysis