Connect Facebook Ads to BigQuery — Full Guide (No Code)

If you want clean, queryable Facebook Ads data in BigQuery without writing a single line of code, this tutorial is for you. In this step‑by‑step guide, I walk you through setting up a Supermetrics Transfer from Facebook Ads into BigQuery, configuring refresh windows and schedules, and preparing the data so you can build robust Looker Studio reports or run SQL analyses. If you prefer a video walkthrough, the full tutorial is embedded below.

Step 1: Prepare your environment — BigQuery + Supermetrics

Before creating the transfer, make sure you have two things ready: a Google BigQuery project with a dataset where Supermetrics can write tables, and a Supermetrics account with Supermetrics Hub access. I assume you already created a Supermetrics folder in the Hub and connected your storage destination to BigQuery. If you haven’t, create a dataset in BigQuery and ensure the service account or credentials used by Supermetrics have permission to write tables (typically the BigQuery Data Editor and BigQuery Job User roles).
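If you prefer to grant the dataset-level permission in SQL rather than the Cloud Console, BigQuery supports a GRANT statement. The project, dataset, and service account names below are placeholders for illustration; substitute your own (note that Job User is a project-level role and still needs to be granted through IAM):

```sql
-- Sketch: grant write access on the destination dataset to the
-- credentials Supermetrics uses. All identifiers here are hypothetical.
GRANT `roles/bigquery.dataEditor`
ON SCHEMA `my_project.supermetrics_data`
TO "serviceAccount:supermetrics@my_project.iam.gserviceaccount.com";
```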

BigQuery dataset selected as storage destination in Supermetrics

On the Supermetrics side you’ll keep your workspace organized by using a dedicated folder for transfers. That folder will contain each transfer configuration and its group of tables. This makes it easy to find and manage multiple ad accounts, channels, or clients.

Step 2: Add or verify data sources in Supermetrics

Open the Supermetrics Hub and go to Data Sources. If you already connected Facebook Ads, you can skip this; otherwise, add the connection.

Supermetrics Hub showing connected data sources

Connecting a new source is straightforward: click connect, follow the authorization prompt, and grant Supermetrics access. For example, when I connected Google Analytics 4, I clicked Connect, completed the auth pane, and selected the GA4 property for that account. The same pattern applies to Facebook Ads: authenticate with your Facebook login and grant permissions for the ad accounts you want to transfer.

Tip: keep only the accounts you actively use connected. Too many connected accounts can make selecting the right account for a transfer confusing, especially if you manage multiple clients.

Step 3: Create a new transfer — select Facebook Ads to BigQuery

Now let’s create the transfer. In the Supermetrics Hub choose the Transfers section and click to create a new transfer. Pick Facebook Ads as the source and BigQuery as the destination.

Creating a new transfer in Supermetrics: selecting Facebook Ads and BigQuery

On the account selection screen, choose the specific Facebook ad account you want to export. If you have many ad accounts, make sure you select the correct one (I typically keep a small test account for demos, which is perfect for tutorials and experimentation).

Step 4: Select tables and fields — use predefined table groups

This is where Supermetrics saves you time. Instead of picking individual metrics and dimensions from the Facebook API, Supermetrics offers predefined table groups. I normally choose the standard tables: ads, videos, campaign, conversion, gender, and age. These cover most reporting needs out of the box.

Table group selection: predefined Facebook Ads tables for export

You can still add filters or inspect which fields will be pulled for each table. Reviewing the schema is helpful if you’ll be joining tables in BigQuery or need specific columns for Looker Studio. I usually leave the schema unchanged unless a client has custom needs.

Step 5: Choose the refresh method and the refresh window

Deciding how Supermetrics updates your tables is important both for data accuracy and cost control. Supermetrics offers different refresh methods and a “refresh window” concept. The window determines how many past days Supermetrics rewrites on each update.

My default approach is to use incremental refresh with a two‑day refresh window. That means: on each scheduled run, Supermetrics pulls new rows and rewrites the last two days of data. This covers late-arriving attribution updates without reprocessing large historical volumes every time.
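Conceptually, you can think of each incremental run with a two-day window as a delete-and-reinsert over the most recent dates. This is only an illustration of the idea, not what Supermetrics executes internally, and the table name is hypothetical:

```sql
-- Illustration of a 2-day refresh window: rows from the last two days
-- are dropped and replaced with freshly pulled data on each run.
DELETE FROM `my_project.supermetrics_data.facebook_ads_campaign`
WHERE date >= DATE_SUB(CURRENT_DATE(), INTERVAL 2 DAY);
-- ...Supermetrics then inserts the re-pulled rows for that date range.
```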

Configuring the refresh method and refresh window in Supermetrics

If your data source is known to backfill historical conversions frequently (some mobile attribution platforms do this), choose a full refresh or extend the refresh window to capture those changes. Conversely, on large accounts where BigQuery costs matter, keep the window short to reduce the data processed.

Step 6: Attribution window, filters, and cleanup options

Facebook’s attribution window affects conversion counts. Supermetrics exposes this setting in the transfer so you can match the attribution logic you use in reporting. I usually keep Facebook’s default attribution window unless a client explicitly tracks conversions using a different window. Mismatched attribution windows are a common source of discrepancies between Facebook and your downstream reports, so double-check this.

Attribution window setting in Supermetrics transfer

Other useful options include:

  • Exclude ads and ad sets with zero impressions: Recommended for large accounts to reduce table size and speed up queries.
  • Include deleted ads: Decide whether you want historical records for deleted objects — I typically exclude them to keep datasets tidy.
  • Filters: Add campaign or date filters if you only need a subset of data.

Step 7: Name the transfer and choose backfill behavior

Give the transfer a clear name that helps you identify the account and purpose later (for example, “Facebook Ads — Client ABC — Reports”). You can choose to run the transfer immediately (a backfill) or only from the time of saving forward. For first-time setups, I usually run a full backfill for a reasonable historical range — a few months, or from the start of the current year — depending on needs and cost considerations.

Naming transfer and running a backfill after saving

Running a full backfill pulls all historical rows and creates the initial tables in BigQuery. Expect this to take longer than daily incremental refreshes, especially for big accounts.

Step 8: Schedule daily runs and alerting

Set up your transfer schedule — I recommend daily runs at a time when your reporting refresh is needed (for example, 01:00 Europe/Lisbon if you want fresh data before the workday starts). Supermetrics allows email or Slack alerting if transfers fail. Configure alerts so you’re notified of any broken transfers promptly.

Scheduling daily runs and connecting Slack/email alerts

Scheduling daily incremental runs with a short refresh window strikes a good balance: near real‑time reporting, limited BigQuery costs, and coverage of late-arriving data.

Step 9: Monitor the transfer and validate data in BigQuery

After saving and running the transfer, Supermetrics will show the transfer in your folder. Head to BigQuery to check the newly created dataset and tables. Supermetrics creates a group of tables (one per predefined table) that you can query directly.

Supermetrics transfer running and tables appearing in BigQuery

Validate a few key metrics against Facebook Ads Manager for a recent date range to confirm attribution and filters are configured correctly. Once you’re happy, connect those tables to Looker Studio or run SQL to join with other data sources.
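A simple way to validate is to aggregate spend, impressions, and clicks by day and compare the totals against the same date range in Ads Manager. A sketch of such a check, assuming hypothetical dataset and column names (inspect the schema Supermetrics actually created before running):

```sql
-- Daily totals for the last 7 days, to compare against Ads Manager.
-- Table and column names are assumptions; adjust to your schema.
SELECT
  date,
  SUM(impressions) AS impressions,
  SUM(clicks)      AS clicks,
  SUM(spend)       AS spend
FROM `my_project.supermetrics_data.facebook_ads_campaign`
WHERE date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
GROUP BY date
ORDER BY date;
```

Small discrepancies on the most recent day or two are normal (late-arriving attribution); persistent gaps usually point to a mismatched attribution window or a filter.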

Step 10: Use your BigQuery tables for reporting and analysis

With Facebook Ads data in BigQuery you can:

  • Build Looker Studio reports using BigQuery as a data source for faster, more flexible visuals.
  • Write SQL to aggregate, join and transform ad data (e.g., join campaigns with Google Search Console or GA4 data).
  • Create scheduled queries in BigQuery to prepare star/snapshot tables for faster dashboarding and reduced cost.

For example, you might write a query that pulls campaign performance for the last 30 days, joins creative metadata, and computes CPA by conversion type — then store the result as a table refreshed daily.
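That kind of query might look like the sketch below. The table and column names are hypothetical stand-ins for whatever your Supermetrics tables expose; you could save this as a BigQuery scheduled query writing to a destination table refreshed daily:

```sql
-- 30-day CPA by campaign and conversion type (hypothetical schema).
-- SAFE_DIVIDE returns NULL instead of erroring when conversions are 0.
SELECT
  campaign_name,
  conversion_type,
  SUM(spend)                                  AS spend,
  SUM(conversions)                            AS conversions,
  SAFE_DIVIDE(SUM(spend), SUM(conversions))   AS cpa
FROM `my_project.supermetrics_data.facebook_ads_conversion`
WHERE date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
GROUP BY campaign_name, conversion_type
ORDER BY spend DESC;
```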

Querying Supermetrics tables in BigQuery to filter by date

Practical tip

Keep a consistent naming convention for transfers and datasets. When you manage multiple clients or ad channels, predictable names save time and reduce errors in Looker Studio or when writing queries.

Summary — what you’ll get from this setup

Once you finish the setup, you’ll have:

  • Daily Facebook Ads data landing in BigQuery in a structured set of tables.
  • A reliable refresh window that covers late-arriving attribution updates without reprocessing unnecessary history.
  • The ability to run advanced SQL, join ad data with other sources, and build professional Looker Studio dashboards.
  • Automated scheduling and alerting so you don’t miss failed transfers.

This setup moves you from manual CSV exports to a scalable, automated pipeline that’s much easier to trust and maintain. It’s especially useful for analysts and marketers who want a single source of truth for ad performance.


Final recommendations

Start small and iterate. Export a single ad account and a handful of tables first, validate numbers against Ads Manager, and then expand the scope. Use a short refresh window to control costs and only increase it if you need to capture late backfills. Finally, add alerting to your workflow so you know if transfers fail and can act quickly.

If you followed this guide, you now have a stable pipeline from Facebook Ads into BigQuery and the freedom to build advanced, reusable reports. Enjoy querying your ad data!