
BigQuery transfer - Facebook Ads backfill

I am trying to backfill some data using a third-party tool because I can't currently do it natively in BigQuery (or can I?). I am requesting data from before 2024-06-13 and it doesn't work (I think that is the day GCP made the tables partitioned).
 
https://cloud.google.com/bigquery/docs/facebook-ads-transfer

My question (if I can't do it from BigQuery transfers...):

Where can I find the request that the BigQuery transfer uses to populate the "AdInsights" table, so that I can make a similar one-off request directly against the Meta Ads API?
ACCEPTED SOLUTION

Hi @tomlag, you're right that the BigQuery Data Transfer Service for Facebook Ads handles daily incremental loads well, but it has limitations when it comes to backfilling historical data, especially for dates prior to a major schema or partitioning change such as the one on June 13, 2024.

Unfortunately, the BigQuery connector doesn’t provide access to the exact request it uses to pull data from the Meta Ads API. That said, we can still replicate the process manually.

Steps to Manually Fetch Historical Data Using the Meta Ads API

  1. Replicating the Ad Insights Table Request
    The AdInsights table in BigQuery is built using Meta's Marketing Insights API. You can access the same data directly through this endpoint:

    https://graph.facebook.com/v18.0/act_<AD_ACCOUNT_ID>/insights

This endpoint lets you retrieve metrics, dimensions, and breakdowns similar to what BigQuery's transfer service provides.

  2. Key Parameters to Include
    To align with BigQuery's export, you'll want to use these parameters:
  • level: one of campaign, adset, or ad.
  • time_range: the historical date range for your backfill (e.g., {"since":"2024-06-01","until":"2024-06-12"}).
  • fields: the metrics you need, such as impressions, clicks, spend, conversions.
  • breakdowns: geographic or demographic breakdowns, if needed.
  3. Example API Call

Here's how the request would look (note that in an actual GET request the time_range JSON must be URL-encoded):

    GET https://graph.facebook.com/v18.0/act_<AD_ACCOUNT_ID>/insights
        ?level=ad
        &fields=impressions,clicks,spend,ad_id,campaign_id
        &time_range={"since":"2024-06-01","until":"2024-06-12"}
        &access_token=<YOUR_ACCESS_TOKEN>
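
If you'd rather script the backfill, here is a minimal Python sketch of the same request using the requests library. The time_increment parameter and the output file name are my own assumptions (one row per day, matching a daily transfer, written as newline-delimited JSON); they are not something the transfer service documents:

    import json
    import requests

    AD_ACCOUNT_ID = "<AD_ACCOUNT_ID>"    # numeric account ID, without the "act_" prefix
    ACCESS_TOKEN = "<YOUR_ACCESS_TOKEN>"

    url = f"https://graph.facebook.com/v18.0/act_{AD_ACCOUNT_ID}/insights"
    params = {
        "level": "ad",
        "fields": "impressions,clicks,spend,ad_id,campaign_id",
        # requests URL-encodes this JSON string for us
        "time_range": json.dumps({"since": "2024-06-01", "until": "2024-06-12"}),
        "time_increment": 1,  # assumption: one row per day
        "access_token": ACCESS_TOKEN,
    }

    rows = []
    while url:
        resp = requests.get(url, params=params)
        resp.raise_for_status()
        payload = resp.json()
        rows.extend(payload.get("data", []))
        # follow the standard Graph API paging cursor until it is absent
        url = payload.get("paging", {}).get("next")
        params = None  # "next" already contains the full query string

    # newline-delimited JSON, which bq load accepts directly
    with open("insights_backfill.json", "w") as f:
        for row in rows:
            f.write(json.dumps(row) + "\n")

Writing newline-delimited JSON is convenient because bq load can ingest it directly in the next step.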
  4. Exporting the Data to BigQuery

Once you’ve successfully fetched the data from the Meta Ads API:

  1. Save the data in CSV or JSON format.
  2. Upload it to BigQuery using either:
    • The bq load command in the terminal (see the sketch after this list).
    • The BigQuery web UI to manually import the file into your dataset.
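
As a sketch, assuming the insights_backfill.json file from the Python example above and a placeholder dataset and table named ads_data.fb_ads_backfill, a load with schema autodetection could look like:

    bq load \
        --source_format=NEWLINE_DELIMITED_JSON \
        --autodetect \
        ads_data.fb_ads_backfill \
        insights_backfill.json

Autodetection is fine for a one-off backfill; for repeated loads you may prefer an explicit schema so the column types match the transfer-created table.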

Alternative Solution: Use Third-Party Tools

If manually querying the API and uploading data feels too time-consuming, tools like Windsor.ai can simplify the process. Windsor.ai offers:

  • Seamless backfilling of historical Facebook Ads data.
  • Native integration with both the Meta Ads API and BigQuery.
  • Automated data pipelines, eliminating the need for manual API requests or script maintenance.

This can save you significant time and effort, especially if you’re handling large amounts of historical data.

Key Takeaways

  • Use the Meta Ads API’s Ad Insights endpoint to replicate the historical data export.
  • Set the time_range, fields, and level parameters to target your desired metrics.
  • Upload the fetched data into BigQuery manually using CSV/JSON files.
  • If you prefer a more automated and hassle-free approach, third-party tools like Windsor.ai can handle the heavy lifting for you.

Hope this helps!
