The BigQuery Data Transfer Service API lets you automate data loading into BigQuery from sources such as Google Cloud pricing data and third-party SaaS applications, and run queries on a recurring schedule. This guide shows you how to enable and configure the API for billing data exports and automated data pipelines.
Prerequisites
Before you begin, ensure you have:
- A Google Cloud project (ideally your FinOps project)
- Owner or Editor role on the project
- Access to the Google Cloud Console or gcloud CLI installed
- BigQuery API already enabled in your project
- Basic understanding of data transfer concepts and use cases
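If you want to confirm the last two prerequisites from the command line, the following sketch checks that the BigQuery API is enabled and lists your own role bindings; PROJECT_ID and YOUR_EMAIL are placeholders to replace with your values:
# Confirm the BigQuery API is enabled in the project
gcloud services list --enabled \
  --project=PROJECT_ID \
  --filter="name:bigquery.googleapis.com"
# List the roles granted to your account on the project
gcloud projects get-iam-policy PROJECT_ID \
  --flatten="bindings[].members" \
  --format="table(bindings.role)" \
  --filter="bindings.members:user:YOUR_EMAIL"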
Understanding BigQuery Data Transfer Service
The BigQuery Data Transfer Service enables:
- Automated data loads: Schedule recurring data imports without manual intervention
- Managed connectors: Pre-built integrations for Google services (Google Ads, YouTube, Cloud Storage, etc.)
- Pricing data exports: Import Google Cloud pricing information for cost forecasting
- Scheduled queries: Run SQL queries on a schedule and save results to tables
- Third-party data: Import from external sources such as Salesforce, Amazon S3, and Teradata
Common use cases for billing/FinOps:
- Import Cloud Pricing API data for rate card information
- Schedule daily cost aggregation queries
- Automate billing report generation
- Sync financial data from external systems
Step-by-Step Guide
Method 1: Using Google Cloud Console
Enable the API
1. Navigate to APIs & Services
   - Open the Google Cloud Console
   - Select your project (e.g., your FinOps project)
   - Click the navigation menu (three horizontal lines)
   - Go to APIs & Services > Library
2. Search for the API
   - In the search bar, type: BigQuery Data Transfer Service
   - Click BigQuery Data Transfer API in the search results
3. Enable the API
   - Click the ENABLE button
   - Wait 10-30 seconds for the API to be enabled
   - You'll see a confirmation message and the API dashboard
4. Verify Enablement
   - Navigate to APIs & Services > Dashboard
   - Look for "BigQuery Data Transfer API" in the list of enabled APIs
   - You should see API metrics and usage information
Method 2: Using gcloud CLI
Enable the API with a Single Command
# Enable BigQuery Data Transfer Service API
gcloud services enable bigquerydatatransfer.googleapis.com \
--project=PROJECT_ID
Example:
# Enable in your FinOps project
gcloud services enable bigquerydatatransfer.googleapis.com \
--project=finops-billing-prod
Verify API is Enabled
# List all enabled APIs
gcloud services list --enabled \
--project=PROJECT_ID \
--filter="name:bigquerydatatransfer"
Expected output:
NAME TITLE
bigquerydatatransfer.googleapis.com BigQuery Data Transfer API
Check API Status
# Get detailed information about the API
gcloud services describe bigquerydatatransfer.googleapis.com \
--project=PROJECT_ID
Method 3: Using Terraform (Infrastructure as Code)
For automated infrastructure deployment:
# Enable BigQuery Data Transfer Service API
resource "google_project_service" "bigquery_data_transfer" {
project = "finops-billing-prod"
service = "bigquerydatatransfer.googleapis.com"
disable_on_destroy = false
}
# Ensure BigQuery API is also enabled (prerequisite)
resource "google_project_service" "bigquery" {
project = "finops-billing-prod"
service = "bigquery.googleapis.com"
disable_on_destroy = false
}
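Setting disable_on_destroy = false keeps both APIs enabled even if these Terraform resources are later destroyed, so existing transfers and scheduled queries aren't broken by a teardown of the Terraform-managed configuration.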
Apply with:
terraform init
terraform plan
terraform apply
Configuring Data Transfers
After enabling the API, set up data transfers for your use case:
Set Up Scheduled Queries
Scheduled queries run SQL automatically and save results to BigQuery tables.
Using Console
1. Navigate to BigQuery
   - Go to BigQuery in the Cloud Console
   - Click Scheduled queries in the left sidebar
2. Create Scheduled Query
   - Click CREATE SCHEDULED QUERY
   - Query: Enter your SQL query
   - Schedule options:
     - Repeats: Daily, Weekly, Monthly, or Custom
     - Start time: When to run the first query
     - Time zone: Select appropriate time zone
   - Destination:
     - Dataset: Select target dataset
     - Table: Enter table name
     - Write preference: Append to table or Overwrite table
   - Click SAVE
Example: Daily Cost Aggregation
-- Schedule this query to run daily at 2 AM
SELECT
DATE(usage_start_time) AS usage_date,
project.id AS project_id,
service.description AS service,
ROUND(SUM(cost), 2) AS daily_cost
FROM
`finops-billing-prod.billing_data.gcp_billing_export_v1_*`
WHERE
_TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY
usage_date,
project_id,
service
Save this to table: finops-billing-prod.billing_data.daily_cost_summary
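The destination table is created automatically on the first run, but if you prefer to pre-create it with date partitioning (so downstream reads scan less data), here is a sketch using the bq CLI; the schema simply mirrors the columns produced by the query above:
# Optional: pre-create the destination table, partitioned by usage_date
bq mk --table \
  --time_partitioning_field=usage_date \
  finops-billing-prod:billing_data.daily_cost_summary \
  usage_date:DATE,project_id:STRING,service:STRING,daily_cost:FLOAT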
Using the bq CLI
# Create a scheduled query
bq query \
--use_legacy_sql=false \
--display_name='Daily Cost Summary' \
--schedule='every 24 hours' \
--destination_table='finops-billing-prod:billing_data.daily_cost_summary' \
--replace=true \
'SELECT
DATE(usage_start_time) AS usage_date,
project.id AS project_id,
service.description AS service,
ROUND(SUM(cost), 2) AS daily_cost
FROM
`finops-billing-prod.billing_data.gcp_billing_export_v1_*`
WHERE
_TABLE_SUFFIX = FORMAT_DATE("%Y%m%d", DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY
usage_date,
project_id,
service'
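Scheduled queries are stored as transfer configurations, so you can list them afterwards with the bq CLI; the example below assumes your data lives in the US multi-region:
# List scheduled queries (stored as transfer configurations)
bq ls --transfer_config \
  --transfer_location=us \
  --project_id=finops-billing-prod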
Set Up Cloud Pricing API Data Transfer
Import Google Cloud pricing data for cost forecasting:
Using Console
1. Navigate to Data Transfers
   - Go to BigQuery > Data transfers
   - Click CREATE TRANSFER
2. Select Source
   - Source: Select Cloud Pricing API
   - Click NEXT
3. Configure Transfer
   - Display name: "Cloud Pricing Import"
   - Schedule options: Daily recommended
   - Destination dataset: Select dataset for pricing data
   - Service account: Use default or specify custom
   - Click SAVE
4. Run Initial Transfer
   - Click RUN NOW to populate pricing data immediately
   - Wait for transfer to complete (may take 5-10 minutes)
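If you later need to re-run the pricing transfer outside its schedule, one option is to start a manual run from the bq CLI; CONFIG_ID below is a placeholder you can copy from the bq ls --transfer_config output:
# Trigger a one-off run of an existing transfer configuration
bq mk --transfer_run \
  --run_time=$(date -u +%Y-%m-%dT%H:%M:%SZ) \
  projects/PROJECT_NUMBER/locations/us/transferConfigs/CONFIG_ID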
Configure IAM Permissions
Grant necessary permissions for data transfers:
# Grant BigQuery Data Transfer Service permission to write to BigQuery
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:service-PROJECT_NUMBER@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com" \
--role="roles/bigquery.dataEditor"
# Grant permission to read from Cloud Storage (if transferring from GCS)
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:service-PROJECT_NUMBER@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com" \
--role="roles/storage.objectViewer"
Best Practices
API Management
- Enable in dedicated project: Use your FinOps project to centralize data transfers
- Monitor API usage: Check quotas and usage in the API dashboard
- Document transfers: Maintain a list of all scheduled queries and data transfers
- Version control: Store query definitions in Git for version tracking
Scheduled Queries
- Use incremental processing: Query only new data with _TABLE_SUFFIX filters
- Set appropriate schedules: Don't over-schedule; daily is often sufficient for billing data
- Optimize for cost: Write efficient queries to minimize BigQuery processing costs
- Handle failures: Set up email or Pub/Sub notifications for failed transfers (see the sketch after this list)
- Test before scheduling: Run queries manually first to validate results
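For the failure-notification point above, one approach (sketched here with placeholder topic and config names, and assuming your bq version supports the --notification_pubsub_topic flag) is to attach a Pub/Sub topic to an existing transfer configuration; email notifications can also be toggled in the console when editing the scheduled query:
# Route run notifications for a transfer configuration to a Pub/Sub topic
# (topic and config resource names are placeholders)
bq update --transfer_config \
  --notification_pubsub_topic=projects/PROJECT_ID/topics/dts-failures \
  projects/PROJECT_NUMBER/locations/us/transferConfigs/CONFIG_ID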
Security
- Use service accounts: Don't use personal accounts for automated transfers
- Grant minimal permissions: Only grant roles necessary for the transfer
- Audit regularly: Review scheduled queries quarterly
- Enable logging: Track who creates/modifies scheduled queries
Cost Optimization
- Limit query scope: Use partitioning and clustering to reduce data scanned
- Choose off-peak times: Schedule during low-usage hours (e.g., 2-4 AM)
- Avoid duplicates: Use appropriate write preferences (truncate vs. append)
- Set quotas: Limit BigQuery slot usage for scheduled queries
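One concrete guardrail for the points above: when validating a query interactively before you schedule it, you can cap the bytes it is allowed to scan so a runaway query fails rather than billing you. The query below is only an illustration against the example billing dataset:
# Fail the query if it would scan more than ~1 GB
bq query \
  --use_legacy_sql=false \
  --maximum_bytes_billed=1000000000 \
  'SELECT service.description AS service, ROUND(SUM(cost), 2) AS total_cost
   FROM `finops-billing-prod.billing_data.gcp_billing_export_v1_*`
   GROUP BY service ORDER BY total_cost DESC LIMIT 10'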
Common Data Transfer Use Cases
Daily Billing Summary
-- Run daily at 3 AM to aggregate previous day's costs
SELECT
DATE(usage_start_time) AS usage_date,
project.name AS project_name,
service.description AS service,
sku.description AS sku,
ROUND(SUM(cost), 2) AS total_cost,
ROUND(SUM(IFNULL((SELECT SUM(c.amount) FROM UNNEST(credits) c), 0)), 2) AS total_credits
FROM
`PROJECT_ID.billing_data.gcp_billing_export_v1_*`
WHERE
_TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY
usage_date, project_name, service, sku
Monthly Cost Trend
-- Run on the 1st of each month to summarize previous month
SELECT
FORMAT_DATE('%Y-%m', DATE(usage_start_time)) AS month,
project.name AS project_name,
ROUND(SUM(cost), 2) AS monthly_cost,
COUNT(DISTINCT DATE(usage_start_time)) AS active_days
FROM
`PROJECT_ID.billing_data.gcp_billing_export_v1_*`
WHERE
_TABLE_SUFFIX BETWEEN FORMAT_DATE('%Y%m01', DATE_SUB(CURRENT_DATE(), INTERVAL 1 MONTH))
AND FORMAT_DATE('%Y%m%d', DATE_TRUNC(CURRENT_DATE(), MONTH) - 1)
GROUP BY
month, project_name
GKE Cost Attribution
-- Run daily to attribute GKE costs to namespaces
SELECT
  DATE(usage.usage_start_time) AS usage_date,
  usage.cluster_name AS cluster_name,
  usage.namespace_name AS namespace_name,
  ROUND(SUM(billing.cost), 2) AS namespace_cost
FROM
  `PROJECT_ID.gke_usage_metering.gke_cluster_resource_consumption` AS usage
JOIN
  `PROJECT_ID.billing_data.gcp_billing_export_v1_*` AS billing
ON
  usage.cluster_name = billing.resource.name
WHERE
  billing._TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
  AND usage.usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
GROUP BY
  usage_date, cluster_name, namespace_name
Troubleshooting
API Not Enabled Error
Problem: "BigQuery Data Transfer API has not been used in project" error
Solution:
- Verify API is enabled: gcloud services list --enabled --filter="bigquerydatatransfer"
- Enable the API: gcloud services enable bigquerydatatransfer.googleapis.com
- Wait 5-10 minutes for enablement to propagate
- Retry the operation
Permission Denied
Problem: "Permission denied" when creating scheduled query
Solution:
- Verify you have BigQuery Admin or BigQuery Data Editor role
- Check service account permissions if using custom service account
- Ensure BigQuery API is also enabled
- Grant roles/bigquery.admin at the project level
Scheduled Query Fails
Problem: Scheduled query runs but produces errors
Solution:
- Check query syntax in BigQuery editor first
- Verify source tables exist and are accessible
- Ensure destination dataset has sufficient quota
- Check service account has read/write permissions
- Review error logs in scheduled query history
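To review the run history from the command line, you can list recent runs of the underlying transfer configuration; CONFIG_ID is a placeholder you can copy from the bq ls --transfer_config output:
# Show recent runs (including error states) for a transfer configuration
bq ls --transfer_run \
  --max_results=10 \
  projects/PROJECT_NUMBER/locations/us/transferConfigs/CONFIG_ID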
Transfer Never Runs
Problem: Scheduled transfer shows "Pending" but never executes
Solution:
- Verify schedule is in the correct time zone
- Check that start time is not in the past
- Ensure Data Transfer Service has quota available
- Review project billing status (must be active)
- Check for API quota limits: Console > IAM & Admin > Quotas
High Costs
Problem: BigQuery Data Transfer consuming excessive budget
Solution:
- Review scheduled query efficiency: Add LIMIT clauses for testing
- Use _TABLE_SUFFIX to limit data scanned
- Reduce transfer frequency (e.g., daily instead of hourly)
- Enable slot reservations for predictable costs
- Set up budget alerts for BigQuery spending
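For the last point, budget alerts can also be created from the CLI; the sketch below uses placeholder billing-account and amount values, and depending on your gcloud version the command may live under gcloud beta billing budgets:
# Create a budget scoped to the FinOps project with 50% and 90% alerts
gcloud billing budgets create \
  --billing-account=000000-AAAAAA-BBBBBB \
  --display-name="FinOps BigQuery budget" \
  --budget-amount=500USD \
  --filter-projects=projects/finops-billing-prod \
  --threshold-rule=percent=0.5 \
  --threshold-rule=percent=0.9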
Next Steps
After enabling the BigQuery Data Transfer Service API:
- Create scheduled queries: Automate daily/weekly billing reports
- Import pricing data: Set up Cloud Pricing API transfer for rate cards
- Set up monitoring: Configure email notifications for transfer failures
- Build dashboards: Use transferred data in Looker Studio
- Optimize queries: Review and optimize scheduled queries for cost efficiency
- Document transfers: Maintain inventory of all data transfers and their purposes