Estuary is building the next generation of real-time data integration solutions.
We're creating a new kind of DataOps platform that empowers engineering teams to build real-time, data-intensive pipelines and applications at scale, with minimal friction, in a UI or CLI. We aim to make real-time data accessible to the analyst, while bringing power tooling to the streaming enthusiast. Flow unifies a team's databases, pub/sub systems, and SaaS around their data, without requiring new investments in infrastructure or development.
Estuary develops in the open to produce both the runtime for our managed service and an ecosystem of open-source connectors.

History
We didn't set out to make real-time data flows more accessible by abstracting away the low-level work; it's simply where our work led us.
While working with billions of daily events, our team researched existing streaming frameworks and ultimately realized we'd have to build our own (Gazette): a scalable, distributed streaming framework designed with kappa architectures in mind, one that requires less continuous resource management and can unify both batch and real-time pipelines.
We've been innovating to make real-time data more accessible to all ever since.
In the Media



Meet Our Team
















Our Investors


Careers
About you: You're passionate about the complexities and potential of our data-driven world, self-motivated, curious, and adaptable.
About us: We're a rapidly growing, highly technical team built by successful repeat founders, working to take the friction out of data engineering.
Working at Estuary
Location
We have offices in New York City and Columbus, Ohio, as well as the option to work fully remotely.
Benefits
We provide healthcare with 100% employee coverage, a 401(k), competitive equity, and unlimited vacation leave.
Culture
Like the product we build, our culture is forward-thinking and open. Our team operates on a foundation of trust: resourceful and collaborative, but also independent.
Community
We offer perks including team happy hours, weekly lunches, and quarterly off-sites. With our rapid growth, now's an exciting time to come aboard.
Know somebody who would be a good fit? We offer a $2,500 referral bonus.
Current Openings
Location: Remote within the US
Senior Backend Engineer
Remote | Full-time
Estuary Flow is a real-time data integration platform built for both fast-moving developers and large-scale enterprises. It combines change data capture, stream processing, and declarative configuration into a unified system that simplifies complex data movement. With Flow, teams can build reliable, low-latency pipelines across systems – without the overhead of managing data infrastructure.
As a senior backend engineer, you'll work on high-impact, technically challenging projects – from performance-critical stream processing to scaling our infrastructure and evolving our developer experience. This role sits on our integrations team, but there's room to rotate and contribute wherever your skills have the most impact.
You’ll join a high-trust, high-impact team, and help shape core parts of our backend — from system architecture and performance optimization to enabling developers to build and deploy integrations quickly and reliably.
What You’ll Do
- Design, implement, deploy, test, and maintain substantial projects primarily written in Go and Python
- Lead technical initiatives around distributed systems, fault tolerance, and performance optimization
- Help shape our connector framework and build the tools that allow data to flow in real time from hundreds of APIs and databases
- Contribute to architectural decisions that improve scalability, maintainability, and developer experience
- Work closely with founders, product leadership, and other engineers to scope and deliver meaningful product improvements
- Write production-grade code, participate in code reviews, and mentor teammates
What We’re Looking For
- 5+ years of industry experience in backend software development, ideally working with distributed systems, stream processing, or real-time infrastructure
- Proficiency in Go and Python is preferred, with a willingness to learn new technologies. Experience with Rust is a plus!
- Comfort building complex systems with a focus on performance and correctness
- Interest in developer experience and building tools that make engineers more effective
- Clear communication skills and a track record of successful cross-functional collaboration
- Bachelor’s degree in Computer Science, Engineering, or equivalent practical experience
Bonus Points For
- Experience with change data capture (CDC), event streaming, or data integration tools (e.g., Airbyte, Fivetran, Debezium)
- Contributions to open source projects, especially in the data ecosystem
- Background working in early-stage startups or fast-moving engineering teams
- Familiarity with cloud-native infrastructure (GCP, AWS)
Why Estuary?
Estuary is VC-backed and led by experienced, repeat founders. We're a small but growing team that values ownership, fast iteration, and helping each other level up.
We offer:
- Competitive compensation, equity, and full benefits
- Flexible remote work (with hybrid options in NYC)
- A strong engineering culture with low ego and high autonomy
- Quarterly team offsites in fun locations like Miami, Austin, Boulder, and New Orleans
Location: United States / Remote
Backend Engineer
Estuary is seeking an enthusiastic and talented Software Engineer to join our dynamic team and contribute to the development of our cutting-edge data integration platform.
This role is focused on extending our ability to build connectors to external systems. The integrations team develops connectors that move data in real time between a multitude of sources and destinations, and we're looking for an engineer to help us efficiently scale connector creation by enhancing the systems and frameworks that make connector development easier.
What You'll Do
- Design, implement, deploy, test, and maintain substantial projects primarily written in Python.
- Develop data integration connectors for capturing data from various external APIs, becoming a subject matter expert in extracting data from these systems.
- Identify common patterns used in data extraction from APIs and build abstractions to streamline connector development.
- Participate in designing and developing large-scale distributed systems for seamless data exchange.
- Write code, participate in reviews, troubleshoot, and debug technical issues.
- Stay updated on current technologies and data processing advancements.
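To give a flavor of the "common patterns" work described above: many API connectors share boilerplate such as cursor-based pagination, which can be abstracted once so each connector supplies only its per-API fetch logic. The sketch below is purely illustrative – the function and data names are hypothetical and do not reflect Estuary's actual connector framework.

```python
from typing import Callable, Iterator, Optional

# A page fetch returns (records, next_cursor); next_cursor is None on the
# last page. Each connector implements only this per-API function.
FetchPage = Callable[[Optional[str]], tuple[list[dict], Optional[str]]]

def paginate(fetch_page: FetchPage) -> Iterator[dict]:
    """Yield records from every page until the API reports no next cursor."""
    cursor: Optional[str] = None
    while True:
        records, cursor = fetch_page(cursor)
        yield from records
        if cursor is None:
            break

# Fake in-memory "API" standing in for a real paginated HTTP endpoint.
PAGES = {
    None: ([{"id": 1}, {"id": 2}], "p2"),  # first page, points to page "p2"
    "p2": ([{"id": 3}], None),             # last page
}

def fake_fetch(cursor: Optional[str]) -> tuple[list[dict], Optional[str]]:
    return PAGES[cursor]

ids = [record["id"] for record in paginate(fake_fetch)]
print(ids)  # [1, 2, 3]
```

With this abstraction, a new connector only needs to translate one API's request/response shape into the `fetch_page` contract, rather than re-implementing the pagination loop each time.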
What We're Looking For
- 3+ years of relevant industry experience or equivalent proficiency gained through internships, coursework, or personal projects.
- Proficiency in Python is preferred, with a willingness to learn new technologies.
- Understanding of software development principles, debugging, and version control.
- Interest in data integration principles and real-time data processing.
- Strong problem-solving skills and ability to work both independently and collaboratively.
- Effective written and verbal communication skills.
- Bonus: Exposure to distributed systems and cloud environments, involvement in open-source communities, or experience in startup environments.
Let's talk about your data
Have a specific question or comment? Send us a note and a team member will reach out to you shortly.