How to Send Instagram Public Data to BigQuery
I often need quick, reliable access to social media data for marketing analysis. Pulling Instagram public data into BigQuery unlocks the ability to join post-level metrics with other data sources, run SQL queries, and build richer Looker Studio dashboards. Below, I explain how I set up Instagram public data ingestion with Windsor.ai, the fields I pull, and practical tips to make the dataset useful for ongoing analysis. If you prefer a visual walk-through, here is the tutorial link:
Step 1: Choose the right connector in Windsor.ai
I use Windsor.ai because it supports hundreds of marketing connectors and destinations, which makes it easy to centralize data into BigQuery without writing API code. Start by searching the Windsor.ai connectors list for public Instagram data.
Step 2: Grant access to Instagram public data
When you select the Instagram public connector, you will need to grant Windsor.ai access through your Facebook login. This step authorizes Windsor.ai to read public account and post-level information. Only grant the permissions required for public data access.

Step 3: Select the fields you want to ingest
Windsor.ai shows a list of available fields for Instagram public. I recommend focusing on the fields that power analysis and dashboards:
- Post-level metrics: likes count, comments count, media caption, media type, media link
- Account-level metadata: account name, biography, follower count, website
- Post metadata: created date, product type
Pick the fields you need and avoid unnecessary columns to keep the table tidy and storage-efficient.

Step 4: Choose a date window and filters
Decide how much historical data you want to backfill. For quick checks I use the last three days. For initial setup, backfilling one year gives more analysis-ready history.
- Set the date range in Windsor.ai when configuring the connector.
- If you plan to overwrite recent records, use a column match on the date field so repeated syncs update the same rows rather than duplicating.
- If you only want certain accounts or media types, apply filters at this step.

Step 5: Configure BigQuery as the destination
Point Windsor.ai to your Google Project and BigQuery dataset. I recommend:
- Using a dataset named clearly for Windsor.ai tables (for example, windsor or windsor_tables).
- Naming the table after the task or connector. Consistent naming simplifies queries and automation.
- Matching a date column so incremental runs can update recent days instead of creating duplicates.
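The date-column matching above amounts to a delete-then-insert upsert over a recent window: each sync replaces the rows for the dates it covers instead of appending copies. This is a minimal sketch of that idea, using sqlite3 as a local stand-in for BigQuery; the table and column names (`instagram_posts`, `post_date`, `media_id`, `likes`) are assumptions, not Windsor.ai's actual schema:

```python
import sqlite3

# In-memory stand-in for the BigQuery destination table.
# Schema and names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE instagram_posts (post_date TEXT, media_id TEXT, likes INTEGER)"
)
conn.executemany(
    "INSERT INTO instagram_posts VALUES (?, ?, ?)",
    [("2024-01-01", "a", 10), ("2024-01-02", "b", 5)],
)

def upsert_window(rows, start_date):
    # Delete the recent date window, then insert the fresh rows,
    # so repeated syncs update the same days rather than duplicating them.
    conn.execute("DELETE FROM instagram_posts WHERE post_date >= ?", (start_date,))
    conn.executemany("INSERT INTO instagram_posts VALUES (?, ?, ?)", rows)

# Re-syncing 2024-01-02 with an updated likes count replaces the old row.
upsert_window([("2024-01-02", "b", 8)], "2024-01-02")
print(conn.execute("SELECT likes FROM instagram_posts WHERE media_id='b'").fetchall())
```

In real BigQuery the same effect is what the connector's column-to-match setting is for; you configure it rather than write the query yourself.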

Step 6: Schedule syncs and backfill
Schedule daily updates at a low-traffic hour (I use 3 a.m.). Windsor.ai supports backfilling — run a one-year backfill during setup to populate historical data, then rely on daily increments going forward. Confirm the column-to-match option is set correctly so the daily job overwrites or appends as intended.

Step 7: Monitor backfill and validate data
After saving the task, Windsor.ai will run the backfill. Monitor the process and validate the contents in BigQuery:
- Check that account-level fields (bio, follower count) and post-level fields (likes, comments, caption, media type) appear as expected.
- Run a few sample SQL queries to confirm date formats, counts, and that no duplicate rows were introduced.
- Verify that the backfill covers the chosen historical range and the daily job runs on schedule.
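The duplicate check above can be written as a GROUP BY ... HAVING query. A sketch against a local sqlite3 stand-in follows; the column names are assumptions, and in BigQuery you would run the same SQL against your actual dataset and table:

```python
import sqlite3

# Local stand-in table with one deliberately duplicated row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE instagram_posts (post_date TEXT, media_id TEXT)")
conn.executemany(
    "INSERT INTO instagram_posts VALUES (?, ?)",
    [("2024-01-01", "a"), ("2024-01-01", "a"), ("2024-01-02", "b")],
)

# Any (post_date, media_id) pair appearing more than once is a duplicate,
# which usually means the column-to-match setting is wrong or missing.
dupes = conn.execute(
    """
    SELECT post_date, media_id, COUNT(*) AS n
    FROM instagram_posts
    GROUP BY post_date, media_id
    HAVING COUNT(*) > 1
    """
).fetchall()
print(dupes)
```

An empty result from this query after a backfill plus one daily run is a good sign the incremental sync is configured correctly.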

How I use the data for marketing analysis
Once Instagram public data lands in BigQuery, it becomes a powerful part of my analytics stack:
- Combine post-performance (likes, comments) with campaign or ad data for blended ROAS and engagement analysis.
- Analyze content performance by media type (image, video, carousel) and caption length.
- Create time-based trend charts in Looker Studio to show follower growth, engagement rate, and top-performing posts.
- Benchmark competitor accounts or market segments if public account scraping is allowed and available.
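The media-type analysis above boils down to a grouped aggregate once the data is in BigQuery. A minimal sketch, again using sqlite3 in place of BigQuery, with assumed field names (`media_type`, `likes`, `comments`):

```python
import sqlite3

# Local stand-in for the synced Instagram table; values are made up.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE instagram_posts (media_type TEXT, likes INTEGER, comments INTEGER)"
)
conn.executemany(
    "INSERT INTO instagram_posts VALUES (?, ?, ?)",
    [("IMAGE", 10, 2), ("VIDEO", 30, 6), ("IMAGE", 20, 4)],
)

# Average engagement (likes + comments) per post, broken out by media type.
rows = conn.execute(
    """
    SELECT media_type, AVG(likes + comments) AS avg_engagement
    FROM instagram_posts
    GROUP BY media_type
    ORDER BY avg_engagement DESC
    """
).fetchall()
print(rows)
```

The same query shape, pointed at the BigQuery table, is also a convenient custom data source for a Looker Studio chart.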

Practical tips and gotchas
- Name tables consistently to make SQL joins predictable.
- Schedule during off-peak hours to reduce the chances of rate limit issues.
- Limit fields when you only need a subset — this keeps storage and query costs lower.
- Use date-based column matching if you plan to rerun recent time windows frequently.
- Validate after backfill — ensure counts match what you expect from the Instagram interface for a few posts and the account-level follower counts are reasonable.
- Keep an eye on connector updates. Windsor.ai keeps adding and improving connectors, which may add new useful fields over time.
Summary
Ingesting Instagram public data into BigQuery with Windsor.ai is a no-code way to get reliable post-level and account-level metrics into a centralized analytics environment. By selecting only the fields you need, scheduling daily updates, and validating the backfill, you’ll unlock SQL-powered analysis and richer Looker Studio dashboards without writing custom API scripts. Use the promo code gaillereports for a discount if you decide to try Windsor.ai.
Further reading and related resources
For more on Looker Studio and data visualization best practices, check the Gaille Reports blog and subscribe page:
- How to Plan Your 2026 eCommerce Marketing Budget (Template)
- Windsor.ai for Looker Studio: Better Filters with Blended Data Sources
- How to install Windsor.ai add-on to Google Sheets?
- Looker Studio Tutorial #11 — How to Build and Style Pie Charts
- Looker Studio Tutorial #10 — How to Build and Style Bar Charts

