
Seamless Data Integration, Unlimited Potential
Discover the simplest way to connect and move your data.
Get hands-on for free, or schedule a demo to see the possibilities for your team.
Estuary Podcast Series: Building Data Pipelines, Analytics, Ad Tech, and Startups
Guests:
- Chase Cottle, Co-founder and CTO, Eyeball Division
- Andrew James, Co-founder and CEO, Eyeball Division

Join hosts Dani and Zulf for a fast-paced walkthrough of how to design and ship right-time data pipelines with Estuary. In this session, you’ll get:
- Context: What “right-time” really means, where Estuary fits among batch vs. streaming and managed vs. self-hosted options, and why unified ingestion reduces cost and complexity.
- Live End-to-End Demo: Connect CDC sources, apply declarative transformations, and materialize data simultaneously into a warehouse, analytical engines, and object storage, plus a look at observability, error recovery, and real-world scenarios like schema drift and backfills.
- Live Q&A: Ask about your specific stack, pipeline designs, and how to scale Estuary for enterprise workloads.
Perfect for data and analytics engineers, architects, and platform owners who want fresher data with fewer moving parts.

Ingest data into a Snowflake warehouse using real-time Snowpipe Streaming or batch COPY INTO commands. Estuary makes Snowflake integration simple with pre-built no-code connectors.

Following along? Find the copy/pasteable commands in Estuary’s Snowflake docs: https://docs.estuary.dev/reference/Connectors/materialization-connectors/Snowflake/

- Set up your first data pipeline for free at Estuary: https://dashboard.estuary.dev/register/?utm_source=youtube&utm_medium=social&utm_campaign=snowflake_ingestion
- Learn more about Estuary’s Snowflake capabilities: https://estuary.dev/solutions/technology/real-time-snowflake-streaming/
- Read the complete guide to Snowpipe Streaming: https://estuary.dev/blog/snowpipe-streaming-fast-snowflake-ingestion/
- Discover how Snowflake fared in Estuary’s Data Warehouse Benchmark: https://estuary.dev/data-warehouse-benchmark-report/
- Download the Snowflake Ingestion Playbook: https://estuary.dev/snowflake-ingestion-whitepaper/

FAQ
1. What is the fastest way to load data into Snowflake? Snowpipe Streaming with row-based ingestion. In Estuary, you can enable it per table using Delta Updates.
2. Why use key pair authentication for Snowflake? It provides strong security and short-lived tokens, and it is Snowflake’s recommended approach for service integrations like Estuary.
3. Can I mix real-time and batch ingestion in the same pipeline? Yes. With Estuary’s Snowflake connector, you can run some tables in batch (COPY INTO or Snowpipe) and others in real time with Snowpipe Streaming.

Media resources used in this video are from Pexels, Canva, and the YouTube Studio Audio Library.

Chapters:
0:00 Introduction
1:05 Snowflake concerns
1:51 Ingestion options
3:23 Beginning the demo
3:47 Create Snowflake resources
4:28 User auth setup
5:17 Estuary connector config
6:44 Customization options
8:07 Wrapping up
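The key pair authentication mentioned in the FAQ starts with generating an RSA key pair on your side. As a minimal sketch using standard OpenSSL commands (the user name `estuary_user` is a placeholder, not from the video; see Estuary’s Snowflake docs linked above for the connector-side configuration):

```shell
# Generate a 2048-bit RSA private key in the PKCS#8 PEM format Snowflake expects.
# (-nocrypt leaves the key unencrypted; Snowflake also supports encrypted keys.)
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -nocrypt -out rsa_key.p8

# Derive the matching public key to register with the Snowflake user.
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub

# In Snowflake, register the public key body (without the PEM header/footer lines):
#   ALTER USER estuary_user SET RSA_PUBLIC_KEY='MIIBIjANBgkq...';
```

The private key then goes into the connector’s authentication settings, and Snowflake verifies short-lived tokens signed with it instead of a stored password.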