
Move Data from Postgres to MongoDB: 3 Ways

Whether you're doing a one-off import or need a real-time data pipeline, here are three ways to connect Postgres to MongoDB. Full steps included.


Having your data where and when you need it is a game-changer, especially when you have to make decisions quickly. The type of data storage you configure for your organization determines how your data can be used. Organizational data needs vary from use case to use case, but in every scenario, you should be able to migrate between databases when needed without worrying about breaking anything.

With the ever-increasing amount of data being stored every day, and the ways that data changes over time, at some point you might need to migrate between different databases, depending on your data needs.

In this article, I'll show you three ways to migrate from Postgres to MongoDB — including the easiest and safest method with Estuary Flow.

In a hurry? Skip the manual steps and migrate effortlessly with Estuary Flow. Start free and move Postgres to MongoDB in minutes.

Jump to Estuary Flow setup steps

What is Postgres?

Postgres is a popular open-source SQL database. It is an object-relational database, unlike MySQL, which is purely relational. It supports richer data types, allowing you to embed documents and files, and it works with many languages, including Python, PHP, C, C++, Java, and JavaScript. It is often used as the primary store for applications because of its flexibility, popularity, security, and reliability.
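
For example, a single Postgres table can mix regular relational columns with an embedded JSON document. This is a minimal sketch with a hypothetical users table, assuming you can already connect with psql:

    psql -h <host> -p "5432" -U <user> -d <databaseName> -c "CREATE TABLE users (id SERIAL PRIMARY KEY, name TEXT NOT NULL, profile JSONB);"
    psql -h <host> -p "5432" -U <user> -d <databaseName> -c "INSERT INTO users (name, profile) VALUES ('Ada', '{\"plan\": \"pro\", \"tags\": [\"beta\"]}');"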

What is MongoDB?

MongoDB is a non-relational database that stores data in a JSON-like document format. It is a great tool for handling data in whatever shape it arrives and serving it to your customers at scale. Records are stored as documents, documents are grouped into collections (the rough equivalent of tables), and databases are hosted in clusters.

In this article, I’ll show you how to move data from an object-relational database (Postgres) to a non-relational database (MongoDB). Whether your Postgres database is hosted in the cloud or locally on your device, the approach is similar.
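
To make that mapping concrete: each Postgres row becomes a MongoDB document, each table becomes a collection, and the database itself lives in a cluster. A minimal sketch, reusing the hypothetical users table from above once it has been loaded into MongoDB:

    mongosh "mongodb+srv://<mongodb_user>:<mongodb_password>@<atlas-cluster>.mongodb.net/<DATABASE>" --eval "db.users.findOne()"

    Illustrative output: { _id: ObjectId("..."), id: 1, name: "Ada", profile: { plan: "pro", tags: [ "beta" ] } }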

The Hidden Risks of Manual Postgres to MongoDB Migration

You could manually export Postgres data and import it into MongoDB. Many do. But beware:

  • Schema Drift: One overlooked table change and your import fails — or worse, corrupts.
  • Data Loss: Manual exports miss nested relationships or special fields.
  • Downtime: Long export/import cycles can cause service interruptions.
  • Developer Fatigue: CLI errors, permission issues, rework — it adds up fast.

Want to skip the pain? Automate your migration with Estuary Flow. No coding, no downtime, no surprises.

Method 1: Migrate PostgreSQL to MongoDB with Estuary Flow

Estuary Flow is a data platform that lets you manage streaming data pipelines and connect to different data sources, whether they’re self-hosted or in the cloud. Over 30 different data connectors are supported, including MySQL, Firebase, Kafka, MariaDB, GitHub, MailChimp, and more. You can sign up for a free account to get started.

With Estuary Flow, you don’t have to worry about CLI sign-ins, manual migration, or anything breaking. It fully automates moving data from Postgres to MongoDB.

Step 1: Set up a Pipeline to Export Data from Postgres

  1. Set up an Estuary account - Sign up or log in at estuary.dev.
  2. Access the Flow Dashboard - After logging in, go to the Estuary Flow dashboard.
  3. Create a New Capture.
    1. Navigate to the Sources tab.
    2. Click + NEW CAPTURE.
    3. Search for PostgreSQL and select it.
Fig 1 - Viewing the Capture section of the dashboard
Fig 2 - Selecting the Postgres capture
  4. Give your capture a name and description.
Fig 3 - Creating a Capture
  5. Click on Endpoint Config.
  6. Open the Azure resource for your Postgres server.
  7. Copy the server name and use it as the server address.
  8. Type in the admin username and password.
  9. Add the name of the database and click Next.

    NB: Be sure to allow Estuary's IP addresses to access the Azure service and that Replication is set to Logical (a quick way to verify this is shown after these steps).
  10. Click on Output Collections and select the table, which will now be stored as a collection.
  11. Click Save and Publish.
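
A quick way to verify the Replication requirement is to check the server's wal_level from psql. This is a minimal sketch, assuming the same connection details used above:

    psql -h <host> -p "5432" -U <user> -d <databaseName> -c "SHOW wal_level;"

The query should return logical. On Azure Database for PostgreSQL, this is typically configured under the server's Replication or Server parameters settings and may require a server restart.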

Step 2: Set up a pipeline to import data to MongoDB using Estuary Flow

  1. Click on Materializations.
  2. Select New Materialization.
Fig 4 - Viewing the Materializations section of the dashboard
  3. Search for MongoDB and select Materialize.
  4. Give the materialization a name and add the details.
Fig 5 - Creating a Materialization
  5. Click on Endpoint Config.
  6. Input the server address (host address). To find it, open MongoDB Atlas and select your cluster, then click Metrics and copy the address shown at the top of the chart (Fig 6 shows the expected format).
Fig 6 - Copying the host address
  7. Enter your database username and password. You can check these under Database Access; make sure your role allows reading from and writing to the database.
  8. Input the name of the database you want to write to.
  9. Add IP address 34.121.207.128 (or Estuary’s current IPs) to the MongoDB Atlas IP allowlist.
  10. Click Next.
  11. Click Save and Publish.
Fig 7 - Connection Success Screen

If you see the Success screen, congratulations! Your PostgreSQL data is now flowing into MongoDB automatically.

  • Your Postgres tables are collections in MongoDB.
  • Future changes in Postgres (inserts, updates, deletes) will sync to MongoDB in real time as long as you keep the pipeline running (a quick end-to-end check is shown below).
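
A quick way to confirm the end-to-end sync is to insert a test row in Postgres and look for it in MongoDB a few moments later. This is a minimal sketch; the users table/collection and the connection details are placeholders for your own:

    psql -h <host> -p "5432" -U <user> -d <databaseName> -c "INSERT INTO users (name) VALUES ('flow_test');"
    mongosh "mongodb+srv://<mongodb_user>:<mongodb_password>@<atlas-cluster>.mongodb.net/<DATABASE>" --eval "db.users.find({ name: 'flow_test' })"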


Method 2: Migrate from Postgres to MongoDB using TSV

TSV stands for Tab-Separated Values. With TSV, you can only export data with a simple schema, i.e., data that doesn’t contain nested documents or complex associations. To do this, we first export the data from Postgres as a TSV file and then import it into MongoDB. Whether you’re using Azure Postgres or any other deployment, you need an existing database you’d like to migrate.

Export the data to TSV

  1. Connect to your Azure Postgres DB using the CLI. You can use the Azure CLI from the Postgres resource you created, or work from your local command line.
  2. Be sure to change the <host>, <user>, <databaseName>, <tableName>, and <filename> placeholders to your own values.

    In the CLI, type: psql -h <host> -p "5432" -U <user> -d <databaseName> -c "\COPY (SELECT * FROM <tableName>) TO '<filename>.tsv' DELIMITER E'\t' CSV HEADER"

    Example: psql -h "my-postgresql-server.postgres.database.azure.com" -p "5432" -U "postgre_user@my-postgresql-server" -d "databaseName" -c "\COPY (SELECT * FROM tableName) TO '/Users/morteza/users.tsv' DELIMITER E'\t' CSV HEADER"

    You may run into permission errors while copying data to the TSV file. The command above works, but you can refer to this guide if you face any issues.
  3. Download the data to your device (a quick sanity check is shown below).
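
    Before importing, it can help to sanity-check the export: the first line should be the header row, and fields should be tab-separated. For example, assuming the file is named users.tsv:

    head -3 users.tsv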

Import data to MongoDB

  1. You need to have a MongoDB cluster created. Make sure that you have read and write access as a user (you can check that under Database Access).

    Change <mongodb_user>, <mongodb_password>, <atlas-cluster>, and <DATABASE> to the correct values.

    You’ll need to install mongoimport (part of the MongoDB Database Tools). Open up your terminal and type: mongoimport --uri mongodb+srv://<mongodb_user>:<mongodb_password>@<atlas-cluster>.mongodb.net/<DATABASE> --collection users --type tsv --file users.tsv --headerline --columnsHaveTypes

    Note: --columnsHaveTypes expects the header fields to include types (for example, name.string()); drop that flag if your header row contains plain column names only.
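
    Once the import finishes, a simple sanity check is to compare row counts on both sides. A minimal sketch, using the same placeholders as above and the hypothetical users table/collection:

    psql -h <host> -p "5432" -U <user> -d <databaseName> -t -c "SELECT COUNT(*) FROM users;"
    mongosh "mongodb+srv://<mongodb_user>:<mongodb_password>@<atlas-cluster>.mongodb.net/<DATABASE>" --eval "db.users.countDocuments()"

    Both commands should report the same number.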

Method 3: Migrate from Postgres to MongoDB using JSON

JSON allows for a more complex schema, which is great when your data contains nested documents. Moving data from Postgres to MongoDB with JSON is similar to TSV but gives you more flexibility; you can also pull in data from related tables, as shown in the sketch below.
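
For example, you can nest rows from a related table inside each exported document before importing. This is a hedged sketch, assuming hypothetical orders and order_items tables linked by an order_id column:

    psql -h <host> -p "5432" -U <user> -d <databaseName> -t -A -c "SELECT json_agg(o) FROM (SELECT orders.*, (SELECT json_agg(i) FROM order_items i WHERE i.order_id = orders.id) AS items FROM orders) o" > orders.json

    Each exported order then carries its line items as an embedded items array, which maps naturally onto a MongoDB document.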

Export the data to JSON

  1. Connect to your Azure Postgres DB using the CLI.
  2. Make sure you change the host, user, database name, table name, and output filename in the example below to your own values.

    psql -h "my-postgresql-server.postgres.database.azure.com" -p "5432" -U "postgre_user@my-postgresql-server" -d "databaseName" -c "\COPY (SELECT * FROM tableName) TO file.json WITH (FORMAT text, HEADER FALSE)"
  3. Download data to your device.

Import data to MongoDB

  1. You need to create a MongoDB cluster. As a user, make sure that you have read and write access (check that under Database Access).

    You’ll need to install mongoimport to import the file into the MongoDB cluster. Open up your terminal and type:

    mongoimport --uri mongodb+srv://<mongodb_user>:<mongodb_password>@<atlas-cluster>.mongodb.net/<DATABASE> --collection orders --jsonArray --file orders.json
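
    After the import, you can spot-check a document to confirm the nested fields came through intact:

    mongosh "mongodb+srv://<mongodb_user>:<mongodb_password>@<atlas-cluster>.mongodb.net/<DATABASE>" --eval "db.orders.findOne()"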

Side-by-Side Comparison: Estuary Flow vs Manual Methods

Feature                | TSV | JSON | Estuary Flow
Simple Schema          | Yes | Yes  | Yes
Complex Schema         | No  | Yes  | Yes
Handles Schema Changes | No  | No   | Yes
Real-Time Sync         | No  | No   | Yes
Manual Work            | Yes | Yes  | No
Downtime Risk          | Yes | Yes  | No
Cost                   | Low | Low  | Free Tier Available

Conclusion

Thanks for making it this far. In this article, we discussed three methods of moving data from Postgres to MongoDB: two manual processes and one no-code platform. For data with a more complex schema, export with JSON rather than TSV. Automated pipelines like Estuary Flow make it easy to connect to various data sources and manage your data, with the choice of streaming in real time or syncing in batches.

You should also consider your data demands, changing data structures, and your organization’s development preferences before choosing a platform to migrate to. If cost is a major factor, keep in mind that Postgres can be self-hosted at no license cost, while a managed MongoDB cluster typically means paying for hosting.

You’ve also seen how to build a Postgres-to-MongoDB automation pipeline with Estuary Flow. It’s free to try Estuary Flow and build pipelines to and from a variety of data systems.
