Integration Guide

How to Connect InstantDM with BigQuery or Snowflake (Data Warehouse Integration)

Learn how to stream Instagram DM event data from InstantDM to BigQuery or Snowflake for advanced analytics and reporting.


For teams that need advanced analytics beyond spreadsheets, streaming InstantDM event data to a data warehouse like BigQuery or Snowflake gives you SQL-powered analysis, historical trends, and the ability to join Instagram data with other business data.


Why Use a Data Warehouse?

  • SQL analysis - query Instagram lead data with full SQL power
  • Historical trends - track lead volume, conversion rates, and campaign performance over time
  • Join with other data - combine Instagram leads with sales, revenue, and marketing data
  • Scalable - handle millions of events without performance issues
  • BI tool integration - connect to Looker, Tableau, Metabase, or Google Data Studio

Common Use Cases

  • Analyze lead conversion rates by Instagram campaign
  • Track which DM flows generate the most revenue
  • Build executive dashboards combining Instagram and sales data
  • Run cohort analysis on Instagram leads
  • Measure ROI of Instagram DM automation

What You'll Need

  • InstantDM account - Trendsetter, Trendsetter Pro, or any Multi plan
  • Data warehouse - BigQuery (Google Cloud) or Snowflake
  • InstantDM API key - found at Settings → API in your InstantDM dashboard

BigQuery Integration

Step 1: Create a BigQuery Dataset and Table

  1. Go to console.cloud.google.com/bigquery.
  2. Create a dataset: instagram_data.
  3. Create a table: events with this schema:
CREATE TABLE instagram_data.events (
  event_id STRING,
  event_type STRING,
  timestamp TIMESTAMP,
  instagram_user_id STRING,
  username STRING,
  flow_name STRING,
  full_name STRING,
  email STRING,
  phone STRING,
  interest STRING,
  comment_text STRING,
  message_text STRING,
  post_url STRING,
  raw_payload JSON,
  inserted_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP()
)
-- Partitioning by day keeps query scans (and costs) down as the table grows
PARTITION BY DATE(timestamp);

Step 2: Build a Webhook Receiver
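The receiver below destructures `event`, `timestamp`, and `data` from the webhook body. As a working assumption, a payload looks roughly like this (field names are inferred from that mapping and all values here are hypothetical - confirm the exact shape against your own webhook logs):

```javascript
// Hypothetical InstantDM webhook payload (shape inferred, not official)
const samplePayload = {
  event: 'flow_completed',            // other types: 'comment', 'dm_received'
  timestamp: '2024-05-01T12:34:56Z',  // ISO 8601
  data: {
    instagram_user_id: '17841400000000000',
    username: 'jane_doe',
    flow_name: 'Lead Magnet',
    response_variables: {
      full_name: 'Jane Doe',
      email: 'jane@example.com',
      phone: null,
      interest: 'coaching',
    },
    comment_text: null,
    message_text: 'GUIDE',
    post_url: 'https://www.instagram.com/p/XXXX/',
  },
};
```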

const express = require('express');
const { BigQuery } = require('@google-cloud/bigquery');
const { v4: uuidv4 } = require('uuid');

const app = express();
app.use(express.json());

// Uses Application Default Credentials; the service account needs
// the bigquery.dataEditor role on the dataset.
const bigquery = new BigQuery({ projectId: 'your-project-id' });
const table = bigquery.dataset('instagram_data').table('events');

app.post('/instantdm-webhook', async (req, res) => {
  const { event, timestamp, data = {} } = req.body || {};

  const row = {
    event_id: uuidv4(), // random id; if InstantDM sends its own event id, prefer it for dedup
    event_type: event,
    timestamp: timestamp,
    instagram_user_id: data.instagram_user_id,
    username: data.username,
    flow_name: data.flow_name || null,
    full_name: data.response_variables?.full_name || null,
    email: data.response_variables?.email || null,
    phone: data.response_variables?.phone || null,
    interest: data.response_variables?.interest || null,
    comment_text: data.comment_text || null,
    message_text: data.message_text || null,
    post_url: data.post_url || null,
    raw_payload: JSON.stringify(req.body), // streamed into the JSON column
  };

  try {
    await table.insert([row]); // streaming insert; rows appear within seconds
    res.status(200).json({ status: 'inserted' });
  } catch (error) {
    // Schema mismatches surface as per-row errors on the thrown error object
    console.error('BigQuery insert failed:', error.errors || error.message);
    res.status(500).json({ status: 'error', message: error.message });
  }
});

app.listen(3000);

Example Queries

Leads per day:

SELECT DATE(timestamp) as date, COUNT(*) as leads
FROM instagram_data.events
WHERE event_type = 'flow_completed' AND email IS NOT NULL
GROUP BY date ORDER BY date DESC;

Top performing flows:

SELECT flow_name, COUNT(*) as completions
FROM instagram_data.events
WHERE event_type = 'flow_completed'
GROUP BY flow_name ORDER BY completions DESC;

Lead conversion funnel:

SELECT
  COUNT(DISTINCT CASE WHEN event_type = 'comment' THEN username END) as commenters,
  COUNT(DISTINCT CASE WHEN event_type = 'dm_received' THEN username END) as dm_senders,
  COUNT(DISTINCT CASE WHEN event_type = 'flow_completed' THEN username END) as flow_completers,
  COUNT(DISTINCT CASE WHEN event_type = 'flow_completed' AND email IS NOT NULL THEN username END) as leads_with_email
FROM instagram_data.events
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY);

Snowflake Integration

Step 1: Create a Table

CREATE TABLE INSTAGRAM_DATA.PUBLIC.EVENTS (
  EVENT_ID VARCHAR,
  EVENT_TYPE VARCHAR,
  EVENT_TIMESTAMP TIMESTAMP_TZ,
  INSTAGRAM_USER_ID VARCHAR,
  USERNAME VARCHAR,
  FLOW_NAME VARCHAR,
  FULL_NAME VARCHAR,
  EMAIL VARCHAR,
  PHONE VARCHAR,
  INTEREST VARCHAR,
  COMMENT_TEXT VARCHAR,
  MESSAGE_TEXT VARCHAR,
  POST_URL VARCHAR,
  RAW_PAYLOAD VARIANT,
  INSERTED_AT TIMESTAMP_TZ DEFAULT CURRENT_TIMESTAMP()
);

Step 2: Use Snowflake's REST API or Snowpipe

For real-time ingestion, use Snowflake's REST API or stage data in S3/GCS and use Snowpipe for automatic loading.

The webhook receiver pattern is the same - receive the event, format the data, and insert into Snowflake.


Via Make.com (No-Code)

  1. Set up the webhook per the Make.com guide.
  2. Add a Google BigQuery → Insert Rows module (or use HTTP for Snowflake).
  3. Map the webhook fields to table columns.
  4. Activate the scenario.

Connecting BI Tools

Looker Studio (formerly Google Data Studio)

  1. Go to lookerstudio.google.com.
  2. Add a BigQuery data source → select your events table.
  3. Build charts: leads over time, top flows, interest distribution.

Metabase

  1. Connect Metabase to your BigQuery or Snowflake instance.
  2. Create questions (queries) and dashboards.
  3. Share with your team.

Troubleshooting

  • BigQuery insert failing - check that the table schema matches the row data, and that your service account has the bigquery.dataEditor role.
  • Snowflake connection timeout - verify your Snowflake account URL and credentials; check network/firewall settings.
  • Duplicate rows - store a unique event_id and deduplicate with MERGE or a dedup query.
  • High costs - BigQuery charges for storage and queries; partition tables by date and limit query scans.

Frequently Asked Questions

Is BigQuery free?

BigQuery offers 10 GB free storage and 1 TB free queries per month. For most InstantDM use cases, you'll stay within the free tier.

When should I use a data warehouse vs. Google Sheets?

Use Google Sheets for simple logging with fewer than 10,000 rows. Use a data warehouse when you need SQL queries, joins with other data, historical analysis, or BI tool integration.

Can I use PostgreSQL instead?

Yes. The webhook receiver pattern works with any database. PostgreSQL is a great choice for smaller-scale analytics. See the PostgreSQL Sync guide for details.

