Estuary
REAL-TIME ETL & CDC

Stream into Amazon S3 Parquet with your free account

Continuously ingest and deliver both streaming and batch change data from hundreds of sources using Estuary's custom no-code connectors.

  • <100ms data pipelines
  • 100+ connectors
  • 2-5x less than batch ELT
01. Select a source
02. Transform in-flight
03. Deliver to Amazon S3 Parquet

Amazon S3 Parquet connector details

The Amazon S3 Parquet materialization connector writes delta updates from Estuary Flow collections to an Amazon S3 bucket in Apache Parquet format, providing efficient, columnar storage optimized for analytics and downstream data lake use cases.

  • Data format: Outputs batched delta updates as Parquet files for compact, query-ready storage
  • Upload scheduling: Configure upload intervals and file size limits to control data batching frequency
  • Flexible authentication: Supports both AWS Access Keys and IAM roles for secure access
  • Schema-aware typing: Automatically maps Flow collection field types to equivalent Parquet data types
  • File versioning: Organizes files by path and version counters for easy traceability and reprocessing
  • Scalable and compatible: Works with AWS S3 and S3-compatible APIs, such as MinIO or Wasabi

💡 Tip: Use this connector to build cost-efficient, analytics-ready data lakes by streaming Flow data to S3 in Parquet format, ready for querying in Athena, Snowflake, or Databricks.

For more details about the Amazon S3 Parquet connector, check out the documentation page.

How to connect your data source to Amazon S3 Parquet in 3 easy steps

1

Connect your data source

Select from more than 100 supported databases and SaaS platforms including PostgreSQL, MySQL, SQL Server, MongoDB, and Kafka.

2

Prepare and transform your data

Apply transformations and schema mapping as data moves, whether you're streaming in real time or loading in batches.

3

Sync to Amazon S3 Parquet

Continuously or periodically deliver data into your destination, with change data capture and reliable delivery for accurate insights.

Get Started Free

Trusted by data teams worldwide

All data connections are fully encrypted in transit and at rest. Estuary also supports private cloud and BYOC deployments for maximum security and compliance.


HIGH THROUGHPUT

Distributed event-driven architecture enables boundless scaling with exactly-once semantics.


DURABLE REPLICATION

Cloud-storage-backed CDC with heartbeats ensures reliability, even if your destination is down.


REAL-TIME INGESTION

Capture and relay every insert, update, and delete in milliseconds.

Real-time, high throughput

Point a connector and replicate changes to Amazon S3 Parquet in under 100 ms. Leverage high-availability, high-throughput change data capture, or choose from hundreds of batch and real-time connectors to move and transform data using ELT and ETL.

  • Ensure your Amazon S3 Parquet insights always reflect the latest data by connecting your databases to Amazon S3 Parquet with change data capture.
  • Or connect critical SaaS apps to Amazon S3 Parquet with real-time data pipelines.
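Conceptually, change data capture keeps a destination current by streaming every insert, update, and delete, which a consumer reduces into the latest state. A minimal Python sketch (the event shape with `op`/`key`/`doc` fields is an assumption for illustration, not Estuary's actual wire format):

```python
# Sketch: reduce a CDC delta-update stream into current state by
# upserting inserts/updates and dropping deleted keys.
def apply_deltas(events):
    state = {}
    for ev in events:
        if ev["op"] in ("insert", "update"):
            state[ev["key"]] = ev["doc"]   # upsert the latest document
        elif ev["op"] == "delete":
            state.pop(ev["key"], None)     # remove tombstoned keys
    return state

# Illustrative event stream for a single table.
events = [
    {"op": "insert", "key": 1, "doc": {"name": "ada"}},
    {"op": "update", "key": 1, "doc": {"name": "ada lovelace"}},
    {"op": "insert", "key": 2, "doc": {"name": "alan"}},
    {"op": "delete", "key": 2, "doc": None},
]
print(apply_deltas(events))  # {1: {'name': 'ada lovelace'}}
```

With a delta-updates materialization like S3 Parquet, the connector appends the change events themselves; a downstream query engine or job performs this kind of reduction when the latest state is needed.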

See how you can integrate any source with Amazon S3 Parquet:

Details

or choose from these popular data sources:

  • PostgreSQL
  • MySQL
  • SQL Server
  • MongoDB
  • Apache Kafka
  • BigQuery
  • Snowflake Data Cloud

Don't see a connector? Request one and our team will get back to you within 24 hours.

Pipelines as fast as Kafka, as easy as managed ELT/ETL, and cheaper than building it yourself.

Feature Comparison

                      Estuary              Batch ELT/ETL        DIY Python      Kafka
Price                 $                    $$-$$$               $$-$$$          $$-$$$$
Speed                 <100ms               5min+                Varies          <100ms
Ease                  Analysts can manage  Analysts can manage  Data Engineer   Senior Data Engineer
Scale
Maintenance Effort    Low                  Medium               High            High
Detailed Comparison

Deliver real-time and batch data from DBs, SaaS, APIs, and more


Popular sources/destinations you can sync your data with

Choose from more than 100 supported databases and SaaS applications. Click any source/destination below to open the integration guide and learn how to sync your data in real time or batches.