Who is this guide for? Administrators and data engineers planning to connect a data warehouse to Quivly for product usage data.
Why Connect a Data Warehouse?
While CRM and billing integrations provide customer and revenue data, data warehouse integrations unlock your product usage data: the most powerful signal for predicting churn and identifying growth opportunities.
What you can track:
- Daily/weekly active users per customer
- Feature adoption and usage frequency
- API calls and volume metrics
- Product engagement trends
- Custom usage metrics specific to your product
What you can do with it:
- Identify customers not using key features
- Detect usage declines before customers churn
- Find expansion opportunities (high usage = ready to upgrade)
- Understand which features drive retention
Supported Data Warehouses
Quivly supports connections to the following data warehouses:
BigQuery
Google Cloud Platform
- Serverless, highly scalable
- Great for large datasets
- Authentication via service account
Snowflake
Snowflake Data Cloud
- Enterprise-grade data warehouse
- Multi-cloud support
- Authentication via username/password or key-pair
PostgreSQL
PostgreSQL Database
- Open-source, self-hosted or managed
- Great for smaller datasets
- Authentication via connection string
MySQL
MySQL Database
- Popular open-source database
- Widely supported
- Authentication via connection string
Redshift
Amazon Redshift
- AWS-native data warehouse
- Integrates with AWS ecosystem
- Authentication via IAM or credentials
Databricks
Databricks Lakehouse
- Unified data and AI platform
- Delta Lake integration
- Authentication via personal access token
Data Format: Long vs Wide
Before connecting your warehouse, understand the two common data formats for usage data:
Long Format (Event-Based)
Structure: One row per event
Use when: You track individual user actions (clicks, page views, API calls)
Example: see the illustrative sketch after the pros and cons below
Pros:
- Flexible (can aggregate any way you want)
- Granular insights
- Easy to add new event types
Cons:
- Large table sizes
- Requires aggregation on every query
- Higher query costs
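As an illustration, a long-format events table might look like the following sketch; the table and column names (product_events, customer_id, event_name, and so on) are placeholders, not names Quivly requires:

```sql
-- Illustrative long-format (event-based) table; all names are placeholders.
CREATE TABLE product_events (
    event_timestamp TIMESTAMP    NOT NULL,  -- when the event occurred
    customer_id     VARCHAR(64)  NOT NULL,  -- ties the event to a customer/account
    user_id         VARCHAR(64),            -- optional: the individual user
    event_name      VARCHAR(128) NOT NULL,  -- e.g. 'login', 'report_exported', 'api_call'
    properties      VARCHAR(4000)           -- optional: JSON string with event details
);

-- One row per event, for example:
-- 2024-05-01 09:12:03 | acme-corp | u_123 | login           | {"plan": "pro"}
-- 2024-05-01 09:12:45 | acme-corp | u_123 | report_exported | {"format": "csv"}
```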
Wide Format (Aggregated Metrics)
Structure: One row per time period with metrics as columns
Use when: You pre-aggregate usage data (daily/weekly rollups)
Example: see the illustrative sketch after the pros and cons below
Pros:
- Faster queries (no aggregation needed)
- Smaller table sizes
- Lower query costs
Cons:
- Less flexible (metrics are fixed)
- Must decide aggregations upfront
- Harder to add new metrics
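For comparison, a wide-format rollup table might look like this sketch; the metric columns shown (active_users, api_calls, reports_run) are illustrative only:

```sql
-- Illustrative wide-format (aggregated) table; all names are placeholders.
CREATE TABLE daily_usage (
    usage_date    DATE         NOT NULL,  -- one row per customer per day
    customer_id   VARCHAR(64)  NOT NULL,
    active_users  INTEGER,                -- pre-aggregated metric columns
    api_calls     INTEGER,
    reports_run   INTEGER,
    PRIMARY KEY (usage_date, customer_id)
);

-- One row per customer per day, for example:
-- 2024-05-01 | acme-corp | 42 | 18230 | 7
```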
Prerequisites for All Warehouses
Before connecting any data warehouse:
1. Prepare Your Usage Data
Ensure you have usage data in a table with:
- Customer identifier (customer ID, account ID, or domain)
- Timestamp or date field
- Event names or metric columns
- Optional: User identifier, event properties
Run a test query to verify data exists and is structured correctly.
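A sanity check along these lines is usually enough (the table and column names follow the long-format placeholder schema above; date functions vary slightly by warehouse):

```sql
-- Verify that recent usage rows exist and the key fields are populated.
-- product_events, customer_id, event_timestamp are placeholder names.
SELECT
    customer_id,
    MIN(event_timestamp) AS first_event,
    MAX(event_timestamp) AS last_event,
    COUNT(*)             AS event_count
FROM product_events
WHERE event_timestamp >= CURRENT_DATE - INTERVAL '7' DAY
GROUP BY customer_id
ORDER BY event_count DESC
LIMIT 20;
```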
2. Create Read-Only Access
For security, create a dedicated user/service account with read-only permissions:
- BigQuery: Service account with the “BigQuery Data Viewer” role
- Snowflake: Database user with SELECT permission on usage tables
- PostgreSQL/MySQL: User with SELECT permission
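For PostgreSQL, for example, the read-only user could be created roughly as follows (role, database, schema, and table names are placeholders; BigQuery and Snowflake use their own role and grant mechanisms):

```sql
-- PostgreSQL sketch: dedicated read-only user for Quivly. Names are placeholders.
CREATE USER quivly_reader WITH PASSWORD 'use-a-strong-generated-password';

-- Allow connecting and reading only the usage table(s).
GRANT CONNECT ON DATABASE analytics TO quivly_reader;
GRANT USAGE ON SCHEMA public TO quivly_reader;
GRANT SELECT ON public.product_events TO quivly_reader;
```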
3. Test Connectivity
Verify Quivly can reach your data warehouse:
- If cloud-hosted: Usually no firewall changes needed
- If self-hosted: Whitelist Quivly’s IP addresses
- If VPN/private network: Set up secure tunnel or VPN connection
4. Plan Sync Frequency
Decide how often Quivly should sync usage data:
| Frequency | Best For | API/Query Costs |
|---|---|---|
| Every 6 hours | Real-time usage tracking | Higher |
| Every 12 hours | Near real-time tracking | Medium |
| Daily | Cost-effective, sufficient for most | Lower |
| Manual | Infrequent updates, testing | Lowest |
Sync Behavior Across Warehouses
How Syncs Work
All warehouse integrations follow the same pattern:
Initial Sync:
- Quivly queries your usage table for the last 90 days (configurable)
- Data is ingested and processed
- Usage metrics are calculated per customer
- Health scores are updated
Incremental Syncs:
- Quivly queries only new data since the last sync (using the timestamp/date field)
- New data is appended
- Metrics are recalculated
- Health scores are updated
Deduplication:
- Long format: Deduplicates based on (timestamp, customer_id, user_id, event_name)
- Wide format: Deduplicates based on (date, customer_id); newer values overwrite, as sketched below
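Conceptually, the wide-format overwrite behaviour is equivalent to an upsert keyed on (date, customer_id), as in the sketch below; this is a mental model with placeholder names, not the literal query Quivly runs (MERGE support varies by warehouse):

```sql
-- Conceptual sketch: newer values for the same (date, customer_id) overwrite older ones.
MERGE INTO quivly_usage AS target
USING incoming_rows AS source
    ON  target.usage_date  = source.usage_date
    AND target.customer_id = source.customer_id
WHEN MATCHED THEN UPDATE SET
    active_users = source.active_users,
    api_calls    = source.api_calls
WHEN NOT MATCHED THEN INSERT (usage_date, customer_id, active_users, api_calls)
    VALUES (source.usage_date, source.customer_id, source.active_users, source.api_calls);
```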
Query Examples
Long format incremental query:
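A long-format incremental query typically filters on the timestamp of the last successful sync, along the lines of this sketch (table, column, and parameter names are placeholders; exact date/parameter syntax varies by warehouse):

```sql
-- Sketch: fetch only events newer than the last successful sync.
-- product_events and :last_synced_at are illustrative placeholders.
SELECT
    event_timestamp,
    customer_id,
    user_id,
    event_name
FROM product_events
WHERE event_timestamp > :last_synced_at  -- high-water mark from the previous sync
ORDER BY event_timestamp;
```

Security and Permissions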
What Permissions Does Quivly Need?
| Warehouse | Required Permissions | Why |
|---|---|---|
| BigQuery | bigquery.tables.get, bigquery.tables.getData, bigquery.jobs.create | Read table schema, query data, run jobs |
| Snowflake | SELECT on usage tables | Query usage data |
| PostgreSQL | SELECT on usage tables | Query usage data |
| MySQL | SELECT on usage tables | Query usage data |
| Redshift | SELECT on usage tables | Query usage data |
Data Security
How Quivly protects your data:
- Read-only access (no write permissions)
- Encrypted connections (TLS/SSL)
- Credentials stored encrypted at rest
- Row-level security (organization isolation)
- No data shared across customers
What Quivly accesses:
- Only usage data from tables you configure
- No PII unless explicitly mapped
- Data is aggregated (not raw event-level details)
What Quivly does not access:
- Other tables or schemas
- User passwords or sensitive fields
- Financial transaction details (unless explicitly configured)
Need help choosing a warehouse or setting up your data?
- Email support: [email protected]
- Book a call: Schedule onboarding
- BigQuery guide: BigQuery Setup →

