Data warehouses are becoming more than passive central data stores; they're actively interacting with the other technologies you use to automate workflows and personalize experiences for your customers.
Snowflake is advancing the functionality of the modern cloud data warehouse with the ability to call external services using External Functions. Combined with Snowflake's native ability to capture changes to tables and views via table streams, you can use External Functions as a more efficient, cost-effective way to orchestrate your reverse ETL pipelines.
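To make the table-stream idea concrete, here's a toy Python model of how a stream tracks changes. This is a simulation of the concept only, not the Snowflake API; the class and method names are invented for illustration, though `SYSTEM$STREAM_HAS_DATA` referenced in the comments is a real Snowflake function.

```python
# Toy model of Snowflake table-stream semantics (a simulation, not the
# Snowflake API): a stream records changes to a table since it was last
# consumed, and reading it advances its offset.

class TableStream:
    def __init__(self):
        self._pending = []

    def record(self, change):
        # In Snowflake, DML on the base table is what populates the stream.
        self._pending.append(change)

    def has_data(self):
        # Conceptually similar to SYSTEM$STREAM_HAS_DATA('my_stream').
        return bool(self._pending)

    def consume(self):
        # Reading the stream in a DML statement advances its offset,
        # so the same changes are not delivered twice.
        changes, self._pending = self._pending, []
        return changes

stream = TableStream()
stream.record({"action": "INSERT", "id": 1})
print(stream.has_data())      # True
print(len(stream.consume()))  # 1
print(stream.has_data())      # False
```

The key property is consume-once delivery: downstream tools see each change exactly once, which is what makes streams a natural trigger signal.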
In sync with Snowflake Summit in Las Vegas today, we’re excited to announce support for Snowflake Stream Triggers to kick off a reverse ETL sync only when your data changes, saving customers time and credits.
This is especially useful for data that's updated at an irregular frequency but needs to be synced ASAP.
Why it matters
Snowflake Stream Triggers will save you Snowflake credits when you need to extract data from your warehouse as quickly as possible, but your data changes more often than your batch-based, orchestrated pipelines run. If you have low-latency use cases and spiky workloads that insert and update data in your data warehouse, you can have Census listen for changes to Snowflake tables and views to kick off a sync, instead of polling continuously for changes.
Snowflake's External Functions don't need to wake up a logical warehouse to run, and they're significantly less credit-intensive than continuously running an extra-small (XS) warehouse to poll for changes. We estimate that using Snowflake Stream Triggers to activate Census syncs can be 10-100x cheaper than traditional orchestration methods.
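As a rough illustration of where a savings estimate like this comes from, here's a back-of-envelope comparison. An XS warehouse bills 1 credit per hour; the change rate and per-sync runtime below are illustrative assumptions, not measured figures.

```python
# Back-of-envelope credit comparison: continuous polling vs. event triggers.
# Assumptions (illustrative): an XS warehouse consumes 1 credit/hour, and an
# event-triggered sync only runs the warehouse ~2 minutes per change.

XS_CREDITS_PER_HOUR = 1.0

def polling_credits_per_day():
    # Polling keeps the XS warehouse running around the clock.
    return XS_CREDITS_PER_HOUR * 24

def triggered_credits_per_day(changes_per_day, minutes_per_sync=2):
    # The warehouse only wakes for the duration of each triggered sync.
    return XS_CREDITS_PER_HOUR * (changes_per_day * minutes_per_sync) / 60

polling = polling_credits_per_day()        # 24.0 credits/day
triggered = triggered_credits_per_day(10)  # ~0.33 credits/day
print(f"polling: {polling:.2f}, triggered: {triggered:.2f}, "
      f"ratio: {polling / triggered:.0f}x")
```

With ten changes a day, the triggered approach works out to roughly 72x fewer credits under these assumptions; the exact multiple depends entirely on how spiky your workload is.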
For batch workflows, Census has a full-featured API to trigger syncs and integrations with your favorite orchestrators like Airflow, Prefect, and Dagster. In these scenarios, Census will wake up your logical data warehouse only when a series of upstream transformations have completed. No need for complex orchestration? No worries – you can set Census on a schedule, and similarly, Census will run regularly and auto-suspend its warehouse when done.
However, what if you need to forward events to destinations so you can trigger time-sensitive workflows like Slack alerts and abandoned cart campaigns? In these cases, you need a fast-path for data that is being streamed into Snowflake and updated continuously. With Snowflake Stream Triggers, Census listens for any changes to your streams and only turns on the logical warehouse when there’s a valid change to sync to your destinations.
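The listen-then-wake pattern described above can be sketched as follows. This is a toy simulation of the control flow, not the Census or Snowflake API; all function names here are invented for illustration.

```python
# Sketch of the listen-then-wake pattern behind stream triggers: the
# logical warehouse is resumed only when a notification carries a valid
# change, so empty notifications cost nothing.

warehouse_resumes = 0

def resume_warehouse_and_sync(rows):
    # Stand-in for waking the logical warehouse and running the sync.
    global warehouse_resumes
    warehouse_resumes += 1
    return len(rows)

def on_stream_change(rows):
    # Triggered path: fires only when the stream actually has data.
    if not rows:
        return 0
    return resume_warehouse_and_sync(rows)

# Three notifications arrive, but only one carries real changes:
synced = sum(on_stream_change(batch) for batch in ([], [{"id": 1}], []))
print(synced, warehouse_resumes)  # 1 row synced, warehouse resumed once
```

Contrast this with polling, where the warehouse runs (and bills) on every check regardless of whether anything changed.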
In conclusion
Historically, reverse ETL pipelines have either been orchestrated explicitly or have polled for changes on a regular schedule. If you needed data streams synced from Snowflake as quickly as possible, your best option was to poll your data warehouse frequently to check for changes, keeping a logical data warehouse running and spending credits the entire time.
With Snowflake Stream Triggers for Census, that’s no longer the case. We believe this is a strictly superior way to accomplish continuous syncs that's both cheaper and faster than previous methods.
Interested in trying it out? Get a demo or start your free trial now.
Meet us at Snowflake Summit Booth #1829
If you're at Snowflake Summit this week, see us at Booth #1829 or join our mixers tomorrow:
🥞 Data Stack Breakfast with Mixpanel, Tuesday June 14th, 7:30 - 9:00 AM PDT
🥂 Happy Hour with Airbyte and Deepnote, Tuesday June 14th, 6:30 - 9:30 PM PDT
If you can't make it, we're giving out some sweet raffle prizes anyway 🙌