
Imagine a world where freight tracking isn't just a lagging indicator of where your shipment was hours ago, but a real-time dashboard showing its precise location, delay factors, and live ETAs. That’s what we built using Estuary Flow, Tinybird, and a streaming-first mindset.
The logistics industry moves trillions of dollars in goods every year, but real-time tracking has traditionally been expensive and complex, requiring Kafka pipelines, manual ETL work, and brittle integrations. With Estuary and Tinybird, however, you can ditch the complexity and deliver a live freight tracking experience at a fraction of the cost.
In this post, we’ll explain how we built this real-time freight tracking dashboard, from CDC ingestion from MongoDB to Kafka API integration using Estuary Dekaf and Tinybird’s real-time analytics API.
The Architecture: MongoDB → Estuary Flow → Tinybird
At its core, the system ingests freight and traffic data from MongoDB, continuously captures changes via Estuary Flow, and streams the data into Tinybird for real-time aggregation and visualization.
- MongoDB: The source database containing shipments and traffic updates.
- Estuary Flow: Captures CDC (Change Data Capture) from MongoDB and streams it in real-time.
- Tinybird: Ingests the live data, runs analytics on it, and provides APIs for visualization.
- Next.js & Tremor: React framework and component library used to visualize shipment status and delays.
Estuary and Tinybird: Operational Analytics Powerhouse
Estuary and Tinybird are a powerful combination that provides software engineers and data practitioners with a high-scale, easy-to-use, and developer-friendly analytics stack.
Estuary, a streaming-first integration platform, makes it easy to move data in real-time. It offers no-code connectors for various data sources, including MongoDB, MySQL, PostgreSQL, and Kafka, making it easy to get started.
Tinybird is a real-time analytics platform that complements Estuary by providing a wide range of analytics functions, including time-series analysis and anomaly detection. Tinybird also offers a user-friendly interface that makes exploring data, creating dashboards, and building real-time applications easy.
Together, Estuary and Tinybird provide a complete solution for real-time analytics. Estuary makes it easy to capture and process data in real time, while Tinybird provides the tools needed to analyze and visualize the data. This combination enables software engineers and data practitioners to build real-time applications that can make a difference.
What Is Change Data Capture (CDC), and Why Does It Matter?
Change Data Capture (CDC) is a technique for tracking changes (INSERTs, UPDATEs, DELETEs) in a database and streaming them in real-time. Unlike traditional batch-based ETL, CDC provides low-latency updates, reducing delays and preventing data inconsistency.
With Estuary Flow's CDC integration, any new, modified, or deleted shipments in MongoDB are immediately captured and made available for processing in real time.
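For intuition, an update captured from MongoDB surfaces as a document enriched with change metadata. The sketch below is illustrative only — the exact envelope fields depend on the connector version, so check the collection preview for the real shape:

```json
{
  "_id": "shipment123",
  "status": "Delayed",
  "current_location": { "latitude": -25.1, "longitude": -47.9 },
  "_meta": {
    "op": "u",
    "source": { "collection": "shipments" }
  }
}
```

Here `op` would distinguish inserts, updates, and deletes, so downstream consumers can reconstruct the latest state of each shipment.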
The Data
Let’s take a look at our demo data before we jump into building our pipeline. We’ll use MongoDB Atlas for this tutorial, but any flavor will work.
The project aims to extract and combine real-time change records from two MongoDB collections: one for shipment data and another for traffic and weather information related to shipment routes. The combined data will be analyzed and displayed on a real-time dashboard.
shipments
{
"_id": "shipment123",
"route_id": "2",
"shipment_id": "SHP12345",
"customer_id": "CUST001",
"origin": "Port of Santos, Brazil",
"destination": "Los Angeles, USA",
"status": "In Transit",
"current_location": { "latitude": -23.965, "longitude": -46.325 },
"expected_delivery_date": "2025-02-05T18:00:00Z",
"events": [{ "checkpoint": "Port of Santos", "timestamp": "2025-01-26T08:00:00Z" }],
"delays": []
}
traffic_weather
{
"route_id": "RTE001",
"traffic_condition": "Moderate",
"weather_condition": "Rain",
"impact_on_ETA_minutes": 30,
"last_update": "2025-01-27T12:15:00Z"
}
As you can see from the examples, the collections share a field called route_id, which we can use to join them and calculate real-time metrics, such as ETA, average delays, and others.
If you’re following along with the tutorial, you can spin up the fake data generator script by grabbing it from this repository and executing the following:
docker compose up
Make sure to update the MongoDB connection details in the docker-compose.yaml file before running.
Now, we need to stream updates from MongoDB into Tinybird without writing a single cron job or spinning up a whole Kafka cluster and Debezium. That’s where Estuary Flow comes in.
Step 1: Setting Up a MongoDB Capture Connector
Using CDC, Estuary Flow can listen to every insert, update, and delete in MongoDB in real time. To set up the connector, you’ll need three things:
- Credentials for connecting to your MongoDB instance and database.
- Read access to your MongoDB database(s); see Role-Based Access Control for more information.
- If you are using MongoDB Atlas or your MongoDB provider requires allowlisting of IPs, you need to allowlist the Estuary IP addresses.
Once you’ve made sure these are in place, you can continue setting up the capture connector.
Setting Up the Capture
- Go to the Estuary Flow Dashboard → Click Sources → Click + New Capture.
- Search for the MongoDB connector and click Capture.
- Configure your connection details:
- Address: mongodb://your-mongo-host:27017
- User: database username
- Password: password for the user
- Database: shipping
- Collections: shipments, traffic_weather
The rest of the configuration parameters can be kept as defaults for this tutorial.
- Click Save and Publish – Estuary now listens for real-time updates from MongoDB! The connector performs a complete backfill to load all existing data in the collections, then continuously ingests new changes as they occur.
- Verify: You can verify the collections are receiving data by navigating to their page from the capture connector.
In Estuary Flow, a collection is a fundamental building block that represents a real-time stream of data, stored as JSON documents in object storage such as S3. Capture connectors continuously extract data from source systems (such as MongoDB, via change data capture) and write incoming change records into collections.
On the collection page, you can preview the data stored on the platform to ensure everything is being captured correctly.
Now that Estuary continuously captures data in real-time, the next step is to get it into Tinybird so you can start building your analytics queries. For that, you'll create a materialization connector.
Step 2: Create Dekaf Materialization
What is Dekaf, and Why is it Useful?
Dekaf is Estuary's Kafka-compatible API, allowing users to integrate real-time event streams without managing Kafka clusters.
This means you can:
- Stream MongoDB CDC data directly into Tinybird without extra middleware.
- Avoid the operational complexity of self-hosted Kafka, Debezium, Flink, and so on.
- Maintain Kafka-like semantics without actually running Kafka.
No extra configuration is required: Dekaf is enabled by default for all Estuary users, so all you need to do is create a new connector!
Setting Up The Materialization
- Head over to the Destinations menu, click the New Materialization button, and search for Tinybird.
- Configure the materialization connector with an access token that you'll use when setting up the Tinybird Kafka source.
- Make sure all collections are configured to be materialized in the bindings section.
That’s it! You are ready to stream data into Tinybird. Head over to the Tinybird dashboard and finalize the data flow.
Streaming Data to Tinybird
On the Tinybird side, we’ll have to create a Kafka source to start consuming data from Estuary. Let’s take a look at how we can do this in a few steps.
- Create a Kafka Data Source: Navigate to the Tinybird dashboard and click on Create Data Source.
- Configure the Kafka source we’ll be using for the Dekaf Materialization connector.
- Configure the schema registry in order to decode incoming Avro documents.
- Verify the schema of the incoming records using the data preview. If everything looks correct, click "Create Data Source", wait a few seconds, and watch data flow in real-time!
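For reference, the connection settings follow the shape below. These values are a sketch based on Dekaf's typical configuration — confirm the exact hostnames, username format, and token in your Estuary materialization's endpoint configuration before using them:

```plaintext
Bootstrap server:   dekaf.estuary-data.com:9092
Security protocol:  SASL_SSL
SASL mechanism:     PLAIN
SASL username:      <your materialization name>
SASL password:      <the access token from Step 2>
Schema registry:    https://dekaf.estuary-data.com
Registry username:  <your materialization name>
Registry password:  <the access token from Step 2>
Topic:              <the Estuary collection to consume>
Decoding:           Avro (via the schema registry)
```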
Step 3: Writing Real-Time Queries in Tinybird
Tinybird is designed for real-time analytics, optimizing queries for large-scale event streams. Unlike traditional SQL databases, it:
- Processes event-driven data instantly.
- Exposes high-performance APIs for real-time dashboards.
Let’s calculate a few metrics for our dashboard.
Metric 1: Top Delayed Customers
This metric tells us which customers are experiencing the most delay and how many shipments each has in total.
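The pipe's SQL isn't shown here, but a minimal sketch might look like the following — assuming the Kafka Data Sources are named `shipments` and `traffic_weather` with the fields shown earlier, and ignoring, for simplicity, de-duplication of repeated CDC updates:

```sql
SELECT
    s.customer_id,
    count(DISTINCT s.shipment_id) AS total_shipments,
    sum(t.impact_on_ETA_minutes) AS total_delay_minutes
FROM shipments AS s
INNER JOIN traffic_weather AS t ON s.route_id = t.route_id
GROUP BY s.customer_id
ORDER BY total_delay_minutes DESC
LIMIT 10
```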
Metric 2: Average Route Performance
This query calculates the average, maximum, and total delay for each route.
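A sketch of that pipe, under the same assumed Data Source name (`traffic_weather`) and field names as above:

```sql
SELECT
    route_id,
    avg(impact_on_ETA_minutes) AS avg_delay_minutes,
    max(impact_on_ETA_minutes) AS max_delay_minutes,
    sum(impact_on_ETA_minutes) AS total_delay_minutes
FROM traffic_weather
GROUP BY route_id
ORDER BY avg_delay_minutes DESC
```

Because CDC streams every update, this averages over the route's update history, which is usually what you want for a rolling "route performance" view.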
Metric 3: Route Status Distribution
Our final metric gives an overview of shipments across all routes and their current status.
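A sketch of this pipe, again with assumed names. Since CDC delivers multiple versions of each shipment, the inner query keeps only the most recent record per shipment; it assumes the Kafka Data Source exposes Tinybird's `__offset` metadata column (verify this in your data source's schema):

```sql
SELECT route_id, status, count() AS shipments
FROM (
    SELECT shipment_id, route_id, status
    FROM shipments
    ORDER BY __offset DESC
    LIMIT 1 BY shipment_id
)
GROUP BY route_id, status
ORDER BY route_id
```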
Now that we have our metrics defined, we can continue to the final step in our tutorial: setting up our dashboard. But wait, how are we going to connect Tinybird to a web application? Thankfully, this is super easy using Tinybird's UI. All we have to do is navigate to each pipe we just implemented and click the Create API Endpoint button in the top-right corner.
Note down the endpoints generated by Tinybird, and head over to the next chapter.
Step 4: Building the Real-Time Dashboard with Next.js and Tremor
Of course, data is not very useful if we don't have easy access to it. So let's build a quick dashboard using Next.js and Tremor to visualize our metrics!
The full example for the dashboard with all charts is available in this repository: GitHub.
Install Next.js and Dependencies
npx create-next-app real-time-dashboard
cd real-time-dashboard
npm install @tremor/react @tanstack/react-query axios
Fetch Tinybird API Data in Next.js
const fetchTinybirdData = async (endpoint) => {
  const res = await fetch(`https://api.tinybird.co/v0/pipes/${endpoint}.json?token=your_token`);
  if (!res.ok) throw new Error(`Tinybird request failed: ${res.status}`);
  return res.json();
};
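Tinybird pipe endpoints return their rows under a `data` key, while Tremor's BarList expects `{ name, value }` objects, so a small mapping helper keeps the components clean. The column names below (`customer_id`, `total_delay_minutes`) are placeholders for whatever your pipe actually outputs — adjust them to match:

```javascript
// Map Tinybird API rows into the { name, value } shape Tremor's BarList
// expects. Falls back to an empty array if the response is missing.
const toBarListData = (tinybirdResponse) =>
  (tinybirdResponse?.data ?? []).map((row) => ({
    name: row.customer_id,          // label shown next to each bar
    value: row.total_delay_minutes, // numeric bar length
  }));
```

You can then pass `toBarListData(await fetchTinybirdData("top_delayed_customers"))` straight into the chart's `data` prop.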
Render Real-Time Charts with Tremor
<Card className="w-1/2">
<h3 className="ml-1 mr-1 font-semibold">
Top Delayed Customers (minutes)
</h3>
<BarList
data={topDelayedCustomersFiltered}
/>
</Card>
Now, our dashboard updates live as shipments change in MongoDB! 🚀
Wrapping Up
In this tutorial, we explored the straightforward process of integrating Estuary Flow with Tinybird, showcasing how this combination creates a comprehensive and robust solution for operational analytics.
By leveraging Estuary Flow's capabilities for real-time data streaming and Tinybird's proficiency in data transformation and analysis, we demonstrated how to build a powerful stack that can handle the demands of modern data-driven applications. This integration enables businesses to unlock valuable insights from their data in real-time, facilitating faster and more informed decision-making.

About the author
Dani is a data professional with a rich background in data engineering and real-time data platforms. At Estuary, Dani focuses on promoting cutting-edge streaming solutions, helping to bridge the gap between technical innovation and developer adoption. With deep expertise in cloud-native and streaming technologies, Dani has successfully supported startups and enterprises in building robust data solutions.