
Salesforce is a system of record for revenue teams, but when it comes to analytics, it often becomes a bottleneck. Whether you're tracking opportunity pipelines, forecasting revenue, or analyzing account engagement, Salesforce data is critical—but challenging to access in real time, at scale, and in a format that suits analytical workloads.
Reports lag. API limits get in the way. And formula fields—essential for business logic—don’t always reflect the latest state without extra work.
That’s where ClickHouse comes in. As a high-performance, columnar OLAP database, ClickHouse offers sub-second query speeds over billions of rows—perfect for real-time dashboards, pipeline health checks, and advanced segmentation.
But bridging the gap between a SaaS platform like Salesforce and a performant analytics warehouse like ClickHouse isn’t easy. Historically, it required batch jobs, custom pipelines, or stitching together tools like Kafka and Debezium.
Estuary Flow simplifies that architecture.
Estuary Flow is a real-time, low-code data integration platform with native support for Salesforce. It captures data using the Bulk and REST APIs, keeps formula fields up to date on a schedule, and automatically handles schema changes. Most importantly, Flow exposes this data through a Kafka-compatible interface—Dekaf—so ClickHouse can ingest it directly using ClickPipes, with no Kafka to deploy or manage.
In this guide, you’ll learn how to stream Salesforce data to ClickHouse in real time using Estuary Flow—syncing objects, handling schema changes, and enabling low-latency analytics without Kafka or custom code.
Why Real-Time Salesforce Analytics Matters
Salesforce stores critical business data, but using it for analytics is a constant challenge. Reports are limited, formula fields don’t update reliably, and API limits block frequent syncs, making real-time visibility nearly impossible.
For teams that need up-to-date insights—whether it’s tracking pipelines, forecasting revenue, or analyzing deal velocity—these limitations add friction and delay.
ClickHouse offers a powerful solution. Its high-performance, columnar engine is built for fast, large-scale analytics, but it doesn’t natively support API-based platforms like Salesforce.
This is why a real-time Salesforce to ClickHouse integration matters. With a reliable way to stream Salesforce data into ClickHouse, teams can build fast dashboards, run advanced queries, and make better decisions, without relying on stale exports or batch jobs.
Step-by-Step Guide: Streaming Salesforce Data to ClickHouse with Estuary Flow
You can set up a real-time Salesforce to ClickHouse pipeline in just a few steps using Estuary Flow. Here’s how to get started:
Prerequisites
Before you begin, make sure you have the following:
For Estuary Flow:
- An active Estuary Flow account
For Salesforce:
- A Salesforce account on the Enterprise tier (or with sufficient API limits)
- A Salesforce user with read-only access to the desired objects
For ClickHouse:
- ClickHouse Cloud account with ClickPipes enabled
Step 1: Set Up Salesforce as the Source
Native, real-time Salesforce capture in Estuary Flow
- Log in to Estuary Flow
- Go to Sources and click + NEW CAPTURE
- Search for Salesforce and select the connector
- Authenticate via OAuth using the read-only Salesforce user credentials
- Optionally, set a Start Date to control historical backfill
- Choose which Salesforce objects to capture (standard or custom)
- Click NEXT, then SAVE AND PUBLISH to launch the capture
Pro Tip: To keep formula fields accurate—even when Salesforce doesn’t flag a change—set a scheduled formula refresh (via cron schedule) per object in the resource settings.
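The refresh schedule uses a standard five-field cron expression (minute, hour, day-of-month, month, day-of-week), e.g. `0 */6 * * *` for every six hours. As a rough illustration, the sketch below validates that format; note that the exact scheduling syntax Estuary accepts may differ, so treat this as an assumption and check the connector docs.

```python
import re

# Minimal validator for a standard five-field cron expression
# (minute, hour, day-of-month, month, day-of-week).
# NOTE: illustrative sketch only; Estuary's actual schedule syntax
# may differ from plain cron.
FIELD = r"(\*|\d+)(-\d+)?(/\d+)?(,(\*|\d+)(-\d+)?(/\d+)?)*"
CRON_RE = re.compile(rf"^{FIELD}( {FIELD}){{4}}$")

def is_valid_cron(expr: str) -> bool:
    """Return True if expr looks like a five-field cron schedule."""
    return bool(CRON_RE.match(expr.strip()))

# "0 */6 * * *" -> refresh formula fields every six hours
print(is_valid_cron("0 */6 * * *"))    # True
print(is_valid_cron("every 6 hours"))  # False
```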
Step 2: Stream to ClickHouse via Kafka-Compatible Materialization
Real-time ClickHouse materialization using Dekaf and ClickPipes
- From the capture view, click MATERIALIZE, or go to the Destinations tab
- Click + NEW MATERIALIZATION and search for ClickHouse
- Configure your materialization:
  - A unique name
  - An auth token of your choice to secure the connection with ClickHouse
- Choose the Salesforce collections you’d like to materialize; these will be bound to Kafka-compatible topics
- Click NEXT, then SAVE AND PUBLISH to finalize
No need to run Kafka—Dekaf handles protocol compatibility behind the scenes.
Step 3: Connect ClickHouse via ClickPipes
Ingest data from Estuary Flow directly into ClickHouse
- In your ClickHouse Cloud dashboard, go to Integrations → Apache Kafka
- Configure ClickPipe with these connection details:
  - Broker: dekaf.estuary-data.com:9092
  - Schema Registry: https://dekaf.estuary-data.com
  - Protocol: SASL_SSL
  - Mechanism: PLAIN
  - SASL Username: your full materialization name
  - Password: the auth token you provided in Step 2
- Map incoming topics (collections) to ClickHouse tables
- Save and activate the pipe
You’ll begin streaming real-time Salesforce data into ClickHouse within seconds.
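Put together, the ClickPipes form fields map onto a standard Kafka-client-style configuration. The sketch below collects the values from this guide into one place and sanity-checks them; the materialization name and token are placeholders you would replace with your own.

```python
# Illustrative sketch: the ClickPipes connection settings from this guide,
# expressed as a Kafka-client-style config dict. The username and password
# arguments are placeholders for your own materialization name and token.
def dekaf_clickpipe_config(materialization: str, auth_token: str) -> dict:
    config = {
        "bootstrap.servers": "dekaf.estuary-data.com:9092",
        "schema.registry.url": "https://dekaf.estuary-data.com",
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.username": materialization,  # full materialization name
        "sasl.password": auth_token,       # token chosen in Step 2
    }
    # Basic sanity checks before pasting values into the ClickPipes form.
    host, _, port = config["bootstrap.servers"].partition(":")
    assert host and port.isdigit(), "broker must be host:port"
    assert config["schema.registry.url"].startswith("https://")
    return config

cfg = dekaf_clickpipe_config("acme/salesforce-clickhouse", "my-secret-token")
print(cfg["security.protocol"])  # SASL_SSL
```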
Monitor and Validate Your Pipeline
Once your Salesforce to ClickHouse pipeline is live, you can monitor its health and ensure everything is working as expected:
- View sync activity, latency, and delivery status in the Estuary Flow dashboard
- Use the ClickHouse Cloud UI to inspect target tables and validate incoming rows
- Query real-time data instantly—e.g., track opportunity value, pipeline velocity, or account engagement
- Schedule alerts or use observability tools by integrating Estuary’s OpenMetrics API with systems like Prometheus or Datadog.
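Estuary's OpenMetrics endpoint exposes plain-text metrics that any Prometheus-style scraper can read. The sketch below parses a sample exposition and pulls out a lag-style gauge; the metric names here are made up for illustration, so check the actual endpoint for the real ones.

```python
# Hypothetical sketch: parse an OpenMetrics/Prometheus text exposition
# and extract simple un-labeled samples. The metric names below are
# invented; consult Estuary's OpenMetrics API for the real names.
def parse_metrics(text: str) -> dict:
    """Return {metric_name: float_value} for simple un-labeled samples."""
    metrics = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and HELP/TYPE comments
        name, _, value = line.rpartition(" ")
        try:
            metrics[name] = float(value)
        except ValueError:
            pass  # ignore lines that are not name/value samples
    return metrics

sample = """\
# TYPE collection_lag_seconds gauge
collection_lag_seconds 2.5
# TYPE documents_written_total counter
documents_written_total 120430
"""
m = parse_metrics(sample)
print(m["collection_lag_seconds"])  # 2.5
```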
Estuary automatically manages schema mapping, formula field refresh, and exactly-once delivery—so your pipeline stays reliable, even as data evolves.
Troubleshooting Tips
- Missing fields in ClickHouse? → Check Salesforce field-level permissions for your Flow user.
- Formula values not updating? → Ensure a cron schedule is set for formula field refresh in each object binding.
- Capture not progressing? → Confirm your Salesforce user has the correct API access and the Bulk API is enabled.
- ClickHouse ingestion errors? → Recheck your SASL credentials and schema registry config in the ClickPipe setup.
- High latency or sync gaps? → Use Estuary’s dashboard to inspect collection lag and connector health. Consider scaling up the data plane or reviewing your API rate usage.
Key Benefits of Streaming Salesforce to ClickHouse with Estuary Flow
Whether you're powering dashboards, forecasting revenue, or tracking customer lifecycle metrics, this integration gives your team the tools to move fast, stay accurate, and eliminate unnecessary infrastructure.
Here’s what sets it apart:
Real-Time Salesforce Analytics
Stop waiting on batch jobs or stale reports. Estuary captures updates from Salesforce—including custom objects—and streams them into ClickHouse within seconds. Your metrics and dashboards stay live and up-to-date.
Automatic Handling of Formula Fields
Salesforce formula fields don’t show up in normal CDC tools because they don’t trigger record modifications. Estuary solves this by refreshing formula values on a schedule you define, keeping calculations current in ClickHouse.
Schema-Aware + No-Code Setup
Flow auto-generates and maintains JSON schema collections, even when Salesforce objects change. You don’t need to manage DDL or write transformation code—just point, configure, and publish.
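To give a rough picture of what a generated collection schema enforces, the sketch below hand-rolls a minimal check of a record against a JSON-schema-like shape. The Opportunity fields shown are illustrative assumptions; Flow's actual generated schemas are richer and maintained for you automatically.

```python
# Illustrative sketch only: a minimal, hand-rolled check of a record
# against a JSON-schema-like shape. Flow generates and maintains real
# JSON schemas automatically; the Opportunity fields here are made up.
OPPORTUNITY_SCHEMA = {
    "required": ["Id", "StageName", "Amount"],
    "properties": {
        "Id": str,
        "StageName": str,
        "Amount": (int, float),
        "IsClosed": bool,
    },
}

def conforms(record: dict, schema: dict) -> bool:
    """True if record has all required keys and type-correct known fields."""
    if any(key not in record for key in schema["required"]):
        return False
    return all(
        isinstance(value, schema["properties"][key])
        for key, value in record.items()
        if key in schema["properties"]
    )

print(conforms({"Id": "006x0", "StageName": "Closed Won", "Amount": 5000.0},
               OPPORTUNITY_SCHEMA))  # True
print(conforms({"Id": "006x0", "Amount": "5000"}, OPPORTUNITY_SCHEMA))  # False
```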
Secure, Scalable, Enterprise-Ready
From OAuth-based Salesforce authentication to ClickHouse-native ClickPipes, Flow supports secure credentials, scoped access, and cloud-scale throughput. You can deploy in Estuary's cloud or bring your own.
No Kafka. No Glue Code. No Ops Overhead.
Dekaf delivers Kafka-compatible streaming without deploying or managing Kafka itself. That means no brokers, no ZooKeeper, no DevOps burden—just a direct stream from Salesforce into ClickHouse.
Ready to see the impact in action? Let’s explore a real-world use case next.
Use Case: Real-Time Revenue Pipeline Dashboard
Imagine you're leading revenue operations at a fast-growing B2B company. Your Salesforce CRM tracks every opportunity, from early-stage deals to Closed Won revenue. But your leadership team wants more than end-of-week snapshots—they need a live view of bookings, sales velocity, and pipeline health.
With Estuary Flow streaming Salesforce data directly into ClickHouse, you can power real-time dashboards that show:
- Total deal value by stage, region, or owner
- Win rate trends updated minute-by-minute
- Historical pipeline comparisons and cohort conversion rates
- Alerting when high-value deals move or stall
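The rollups behind such a dashboard are simple aggregations. In production you'd express them as ClickHouse SQL, but the Python sketch below shows the same logic over a handful of made-up opportunity rows (field names are assumptions, not Salesforce's actual API names):

```python
from collections import defaultdict

# Illustrative sketch: the aggregations behind a pipeline dashboard,
# computed in Python over invented rows. In practice you'd run the
# equivalent GROUP BY queries directly in ClickHouse.
opportunities = [
    {"stage": "Prospecting", "region": "EMEA", "amount": 20000, "won": False},
    {"stage": "Closed Won",  "region": "EMEA", "amount": 55000, "won": True},
    {"stage": "Closed Won",  "region": "AMER", "amount": 30000, "won": True},
    {"stage": "Closed Lost", "region": "AMER", "amount": 10000, "won": False},
]

def total_by(rows, key):
    """Total deal value grouped by an arbitrary dimension."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row["amount"]
    return dict(totals)

def win_rate(rows):
    """Share of closed deals that were won."""
    closed = [r for r in rows if r["stage"].startswith("Closed")]
    return sum(r["won"] for r in closed) / len(closed) if closed else 0.0

print(total_by(opportunities, "region"))  # {'EMEA': 75000.0, 'AMER': 40000.0}
print(round(win_rate(opportunities), 2))  # 0.67
```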
And because ClickHouse handles sub-second queries across billions of rows, you can build granular visualizations without worrying about performance bottlenecks.
Better yet, the data is:
- Always current — no need to refresh exports
- Trustworthy — formula fields are synced and schemas are enforced
- Ready for BI — connect ClickHouse to tools like Metabase, Tableau, or Superset
The result? GTM teams stay aligned, leaders gain visibility, and decision-making gets faster, with zero load on your Salesforce instance.
Conclusion: A Better Way to Power Salesforce Analytics
Bringing Salesforce data into ClickHouse in real time used to mean stitching together Kafka, managing ETL scripts, and hoping your formula fields didn’t break the pipeline. With Estuary Flow, those days are over.
You get a scalable, low-code solution to stream Salesforce data—standard objects, custom fields, and formulas—directly into ClickHouse with schema enforcement, exactly-once delivery, and zero Kafka infrastructure.
Whether you're building live dashboards, syncing CRM metrics into your data stack, or enabling advanced reporting for GTM teams, this integration delivers the speed, flexibility, and reliability your business needs.
Ready to Get Started?
- Try Estuary Flow — Spin up your first Salesforce → ClickHouse pipeline in minutes
- Join the Estuary Community — Get support, share feedback, and connect with other data teams
- Talk to our team — Let’s discuss how this fits your stack, security needs, or data roadmap
Real-time CRM analytics doesn’t have to be complicated. With Estuary and ClickHouse, it’s finally simple.
About the author
Team Estuary is a group of engineers, product experts, and data strategists building the future of real-time and batch data integration. We write to share technical insights, industry trends, and practical guides.
