After exporting your Azure cost data to a Storage Account, you can analyze it using Azure Synapse Analytics or Power BI. Choose your tool based on whether you prefer in-place querying, pipelines, or visualization dashboards.
Option 1: Use Azure Synapse Analytics
A. Query with Serverless SQL
- Ensure Hierarchical Namespace is enabled on your Storage Account (Data Lake Gen2).
- In Synapse Studio, use OPENROWSET or create external tables to query exported CSV/Parquet files directly.
Example query:

```sql
SELECT *
FROM OPENROWSET(
    BULK 'https://<storage-account>.dfs.core.windows.net/<container>/cost-exports/*.csv',
    FORMAT = 'CSV',
    PARSER_VERSION = '2.0',
    HEADER_ROW = TRUE
) AS rows;
```
B. Load Data via a Synapse Pipeline
- Create a pipeline in Synapse or Azure Data Factory.
- Copy daily cost export files into:
- A dedicated SQL pool, or
- A Data Lake folder in Parquet format.
- Schedule the pipeline to run after your daily export.
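Daily exports typically land in date-stamped subfolders, so a scheduled pipeline needs to resolve the previous day's source path at run time. A minimal sketch of that path logic in Python (the `yyyyMMdd` folder convention and the `cost-exports` prefix are assumptions; check how your export actually partitions its output):

```python
from datetime import date, timedelta

def daily_export_path(container: str, export_name: str, day: date) -> str:
    """Build the folder path a daily cost export is assumed to land in.

    The yyyyMMdd folder naming is a hypothetical convention; verify it
    against the actual layout in your Storage Account.
    """
    return f"{container}/cost-exports/{export_name}/{day:%Y%m%d}"

# Resolve yesterday's folder, as a pipeline scheduled after the export would.
yesterday = date.today() - timedelta(days=1)
print(daily_export_path("exports", "daily-costs", yesterday))
```

In Synapse or Data Factory pipelines, the same idea is usually expressed with dynamic-content expressions on the dataset path rather than code.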
C. Use Synapse Spark
- Open a Spark notebook in Synapse.
- Read the cost files into a DataFrame:

```python
df = spark.read.csv(
    "abfss://<container>@<storage-account>.dfs.core.windows.net/cost-exports/",
    header=True,
)
df.show()
```
- Use Spark for data transformation or join with other datasets.
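As a sketch of the kind of transformation Spark would handle, here is the same group-and-sum in plain Python (the `ServiceName` and `Cost` column names and the sample rows are assumptions; real export schemas vary by export type and have many more columns):

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample shaped like a cost export file.
sample = """ServiceName,Cost
Virtual Machines,10.50
Storage,2.25
Virtual Machines,4.00
"""

# Total cost per service, mirroring a Spark groupBy(...).sum(...).
totals = defaultdict(float)
for row in csv.DictReader(io.StringIO(sample)):
    totals[row["ServiceName"]] += float(row["Cost"])

print(dict(totals))  # {'Virtual Machines': 14.5, 'Storage': 2.25}
```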
Access Requirements
- Assign Storage Blob Data Reader to your Synapse workspace managed identity on the Storage Account.
- Use wildcard paths if your export is partitioned across multiple files.
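The effect of a wildcard path over a partitioned export can be pictured like this (the blob names below are hypothetical):

```python
from fnmatch import fnmatch

# Hypothetical listing of a partitioned export folder.
blobs = [
    "cost-exports/20240101-20240131/part-0.csv",
    "cost-exports/20240101-20240131/part-1.csv",
    "cost-exports/manifest.json",
]

# Same idea as the wildcard in the OPENROWSET BULK path:
# match every CSV under the export folder, skip everything else.
csv_files = [b for b in blobs if fnmatch(b, "cost-exports/*/*.csv")]
print(csv_files)
```

Serverless SQL and Spark both expand such wildcards server-side, so a single query reads all matching partition files.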
Option 2: Use Power BI
A. Use the Azure Cost Management Connector (Recommended)
- In Power BI Desktop, go to:
- Get Data > Azure > Microsoft Cost Management.
- Choose your scope:
- Billing Account ID (for MCA)
- Enrollment Number (for EA)
- Sign in with Azure credentials and import your data.
Result: Power BI connects directly to the Cost Management API; no storage export is required.
B. Import from Azure Storage
Use this method if you're working with exported CSV/Parquet files.
For CSV:
- Use Azure Blob Storage or Azure Data Lake Gen2 connector.
- Provide the Storage URL or connect via SAS token.
- Use Power Query to combine and transform files.
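Conceptually, Power Query's Combine Files step stacks same-schema files into one table, as this plain-Python sketch shows (the file contents are hypothetical):

```python
import csv
import io

# Two hypothetical daily export files sharing one schema.
files = [
    "Date,Cost\n2024-01-01,5.00\n",
    "Date,Cost\n2024-01-02,7.50\n",
]

# Parse each file and append its rows to one combined table.
combined = []
for content in files:
    combined.extend(csv.DictReader(io.StringIO(content)))

print(combined)
```

Power Query does this declaratively in M, applying the same parsing steps to every file in the folder.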
For Parquet:
- Use Azure Data Lake Gen2 connector.
- Older Power BI Desktop releases may lack built-in Parquet support; update to a recent version if the Parquet option is missing.
Refreshing Data in Power BI
- Schedule automatic data refreshes in Power BI Service.
- Ensure your dataset can access the storage (via gateway or direct cloud access).
Summary

| Tool | Approach | Notes |
|---|---|---|
| Synapse | Query in place or ingest via pipeline | Best for advanced analysis or joining with enterprise data |
| Power BI | Direct API connector or import from Storage | Fastest setup with the Cost Management connector for supported accounts |
Once connected, filter by subscription, resource group, service, or tag to identify cost trends.