
How to Stream Oracle Data to Microsoft Fabric with Estuary Flow

Learn how to stream Oracle data to Microsoft Fabric in real-time with Estuary Flow. Set up a low-latency, no-code data pipeline for faster insights and analytics.


OracleDB, a popular RDBMS, is widely used across business applications owing to its reliability and efficiency. However, it also has shortcomings: high licensing costs, operational complexity, and the need for additional hardware under heavy workloads.

To perform cost-effective, real-time analytics, you can stream your data from Oracle to Microsoft Fabric. This user-friendly, end-to-end analytics platform facilitates centralized data storage and embeds AI capabilities for transforming your data into actionable insights.

Estuary Flow, a no-code, real-time CDC platform, is an excellent choice for an effortless Oracle-to-Fabric integration. You can build low-latency, real-time pipelines that stream data from your chosen sources to your selected destination. Let’s look at the details of how to stream Oracle data to Microsoft Fabric using Estuary Flow.

Why Stream Oracle Data to Microsoft Fabric?

Microsoft Fabric overview
  • Cost-Efficiency: Microsoft Fabric is a comprehensive analytics service. It offers a unified platform with multiple services such as Power BI, data science, data engineering, and data factory. This allows your business to avoid the complexity of licensing each service individually; a single license covers the whole platform.
  • AI-Enhanced Analytics: The AI-enhanced toolset in Microsoft Fabric—Copilot—can assist across various workloads. Whether you’re a data engineer, data scientist, or business analyst, Copilot can help you streamline workflows, generate insights, and create visualizations for your Oracle data.
  • Unified Data Lake: You can stream your Oracle data into Microsoft Fabric’s OneLake and combine it with your data from other sources. This helps simplify data management and analytics. OneLake ensures easy data discovery, sharing, and uniform enforcement of policy and security settings.
  • Real-Time Intelligence: Microsoft Fabric offers real-time intelligence as an end-to-end solution for streaming data, event-driven scenarios, and data logs. This supports streaming data analysis of your Oracle data for immediate decision-making.

For details on Microsoft Fabric, its features, and competitors, see Microsoft Fabric: A Unified Data Platform with Power BI.

See how Microsoft Fabric unifies your data and how Estuary Flow simplifies integration in this quick video before we dive into the step-by-step guide.

Step-by-Step Guide: Streaming Oracle to Microsoft Fabric

Let’s look into how you can use Estuary Flow to ingest streaming data from Oracle to Fabric without extra ETL steps.

Prerequisites

Before getting started, ensure that the following prerequisites are in place for a smooth integration:

For Oracle:

  • Oracle 11g or above.
  • Network access from Estuary Flow to your Oracle database (for example, via an allowlist if they exist in separate VPCs).
  • A dedicated read-only Estuary Flow user with access to Oracle tables for replication.
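
Creating that dedicated user typically looks something like the sketch below. The user name and password are placeholders, and the exact privileges depend on your Oracle version and edition; follow Estuary's OracleDB documentation for your setup.

```sql
-- Illustrative only: names are placeholders, and exact privileges vary
-- by Oracle version. Consult Estuary's OracleDB docs for specifics.
CREATE USER flow_capture IDENTIFIED BY "choose-a-strong-password";

GRANT CREATE SESSION TO flow_capture;       -- allow logins
GRANT SELECT ANY TABLE TO flow_capture;     -- or grant SELECT per table
GRANT SELECT_CATALOG_ROLE TO flow_capture;  -- read data-dictionary views
GRANT LOGMINING TO flow_capture;            -- LogMiner access (Oracle 12c+)

-- CDC also relies on supplemental logging so that the redo logs carry
-- complete row images for changed rows:
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
```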

For Fabric:

  • A Fabric Warehouse connection string.
  • The name of a storage account and its container for storing temporary staging files to load into the warehouse.
  • A key for the storage account.
  • A service principal that can connect to the warehouse, along with its Client ID and Client Secret.

Step 1: Configure Oracle as the Source

  • Sign in to your Estuary account.
  • On the dashboard, select the Sources option from the left pane.
  • Click the + NEW CAPTURE button.
Searching for Oracle in the Estuary dashboard
  • On the Create Capture page, use the Search connectors field to find the Oracle connector.
  • Among the available Oracle Database connectors (Real-time, Batch, and Flashback), select the Real-time connector. Click the connector’s Capture button to proceed.
Oracle capture configuration
  • On the connector configuration page, specify mandatory details, including:
    • Name: A unique name for your capture.
    • Data Plane: The data plane you’d like to use.
    • Server Address: The host or host:port at which you can access the database.
    • User: The database user for authentication.
    • Password: The password corresponding to the specified database user.
    • Database: The logical database name to capture data from.
  • After providing all the required details, click the NEXT button followed by SAVE AND PUBLISH.
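
If you manage specs with flowctl rather than the UI, the same settings map onto a capture spec roughly as follows. The tenant prefix, table names, and credentials here are illustrative placeholders, not values from this walkthrough.

```yaml
# Illustrative capture spec; all names and credentials are placeholders.
captures:
  acmeCo/oracle-orders:
    endpoint:
      connector:
        image: ghcr.io/estuary/source-oracle:dev  # real-time Oracle connector
        config:
          address: "db.example.com:1521"  # Server Address (host or host:port)
          user: "flow_capture"            # dedicated read-only user
          password: "secret"              # ideally stored encrypted
          database: "ORCL"                # logical database to capture from
    bindings:
      - resource:
          owner: "SALES"
          table: "ORDERS"
        target: acmeCo/oracle-orders/orders
```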

Once configured, the connector captures data from OracleDB into a Flow collection using Oracle LogMiner. For additional connector details, see Estuary’s OracleDB documentation.

Step 2: Configure Microsoft Fabric as the Destination

After a successful capture from OracleDB, you will see a pop-up window summarizing the capture details. To proceed with configuring the destination end of the pipeline, you can:

  • Click the MATERIALIZE COLLECTIONS button on the pop-up window, OR
  • Select the Destinations option from the left-side pane of the dashboard and click the + NEW MATERIALIZATION button.

When you’re redirected to the Create Materialization page, follow these steps:

Searching for Microsoft Fabric in the Estuary dashboard
  • Use the Search connectors box to find the Azure Fabric Warehouse connector.
  • The connector will appear in the search results; click its Materialization button.
Microsoft Fabric materialization configuration
  • On the connector configuration page, specify the mandatory details, including:
    • Name: A unique name for your materialization.
    • Data Plane: The data plane you’d like to use. Currently, a materialization’s data plane must be the same as that of its source systems.
    • Client ID: The client ID for the service principal to connect to the warehouse.
    • Client Secret: The client secret for the service principal to connect to the warehouse.
    • Warehouse: The Azure Fabric Warehouse name to connect to.
    • Schema: The schema for bound collection tables and associated metadata materialization tables.
    • Connection String: The warehouse SQL connection string.
    • Storage Account Name: Name of the storage account to write temporary files.
    • Storage Account Key: The key for the specified storage account.
    • Storage Account Container Name: Name of the storage account’s container.
  • If the collections added to your capture weren’t automatically added to your materialization, you can add them manually. To do this, click the SOURCE FROM CAPTURE button in the Source Collections section and select the OracleDB capture to link to your materialization.
  • Finally, click NEXT > SAVE AND PUBLISH.
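
With this many mandatory fields, it can help to sanity-check your settings before saving. The hypothetical pre-flight check below simply mirrors the form labels above as dictionary keys; it is not an official Estuary API.

```python
# Hypothetical pre-flight check for the Fabric materialization settings.
# Keys mirror the form labels above; this is not an official Estuary API.
REQUIRED_FIELDS = [
    "name", "client_id", "client_secret", "warehouse", "schema",
    "connection_string", "storage_account_name", "storage_account_key",
    "storage_account_container_name",
]

def missing_fields(config: dict) -> list:
    """Return the mandatory fields that are absent or empty."""
    return [f for f in REQUIRED_FIELDS if not config.get(f)]

config = {
    "name": "fabric-orders",
    "client_id": "xxxx",
    "client_secret": "xxxx",
    "warehouse": "sales_wh",
    "schema": "dbo",
    "connection_string": "server.datawarehouse.fabric.microsoft.com",
    "storage_account_name": "stagingacct",
    "storage_account_key": "xxxx",
    "storage_account_container_name": "flow-staging",
}
print(missing_fields(config))  # an empty list means every field is set
```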

Once configured, the connector will materialize Flow collections of your OracleDB data into tables in your Azure Fabric Warehouse. Hint: since setting up the related Azure resources can be somewhat complex, we created a guide specifically covering Fabric configuration if you run into any difficulty.

Ready to power your analytics with real-time Oracle-to-Fabric streaming? Try Estuary Flow today or schedule a consultation.

Why Estuary Flow?

Oracle CDC (Change Data Capture), or streaming Oracle data, enables real-time analytics and helps maintain data consistency across systems.

Among the different Oracle CDC methods are Oracle GoldenGate, Oracle XStream, and Oracle AQ. However, these methods come with certain drawbacks, including:

  • Increased Costs: If you’re a small-to-medium business, Oracle services, particularly GoldenGate, may not be the optimal choice since they are more expensive. For additional scalability and flexibility, these services require further investments in infrastructure, services, and licenses.
  • Technical Complexity: Most of these Oracle CDC services are complex and involve a significant learning curve. Implementation and maintenance require specialized skills and expertise, which, in turn, results in increased costs.
  • Resource-Intensive: With high loads, these services can consume considerable system resources, requiring dedicated infrastructure for optimal performance.

Estuary Flow is an efficient, easy-to-use data integration platform with real-time ETL capabilities and a good alternative to the Oracle CDC methods above. It lets you consolidate data from multiple sources into a centralized destination for further analysis, helping you eliminate data silos and streamline workflows.

Streaming options

Let’s look into some of the features that make Estuary Flow a reliable Oracle CDC solution:

  • A Wide Range of Easily Configurable Connectors: Estuary Flow offers 200+ no-code batch and streaming connectors that you can use to extract data from sources and load it into a destination. From data lakes, data warehouses, and analytics platforms to CRMs, social media platforms, and apps, various connectors are available.
  • Change Data Capture (CDC): Our flagship feature—CDC—allows you to track and synchronize changes in the source system data and replicate these changes to the target system in real-time. With CDC, you can connect to a system and start reading a stream immediately while also capturing its (24-hour) history. The destination will receive this combined stream in real-time with a sub-100ms latency.
  • Secure Data Transfers: Estuary Flow keeps your data and systems secure with airtight compliance standards. The platform never stores your data; it only moves it. With SOC 2 Type II certification and compliance with regulatory standards such as GDPR, HIPAA, CCPA, and CPRA, your data stays safe.
  • Scalability: With a 7+ GB/s throughput capacity, Estuary Flow provides a scalable service for your fluctuating workloads. Built to be horizontally scalable, the platform is well-suited for small and large enterprises.
  • Supports ETL and ELT: With Estuary Flow, you can merge data from multiple sources and transform it before loading it into the data warehouse (ETL), after (ELT), or both (ETLT). For streaming or batch transforms, you can use SQL or TypeScript (ETL) and dbt (ELT).
  • Multiple Deployment Options: Estuary offers several deployment options, including private deployment: you can run Estuary in your own private network, keeping your data fully under your control. Other deployment modes include public deployment and BYOC (Bring Your Own Cloud).
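
Conceptually, the CDC behavior described above boils down to replaying an initial snapshot of the source and then applying ordered change events on top of it. The toy sketch below models that idea only; it is not Estuary Flow's internal implementation.

```python
# Toy illustration of CDC semantics: replay a snapshot, then fold in an
# ordered stream of change events. Not Estuary Flow's actual internals.
snapshot = {1: {"id": 1, "status": "new"}, 2: {"id": 2, "status": "new"}}

changes = [
    {"op": "u", "key": 2, "doc": {"id": 2, "status": "shipped"}},  # update
    {"op": "c", "key": 3, "doc": {"id": 3, "status": "new"}},      # insert
    {"op": "d", "key": 1, "doc": None},                            # delete
]

def apply_changes(state, events):
    """Fold a stream of change events into the materialized state."""
    for ev in events:
        if ev["op"] == "d":
            state.pop(ev["key"], None)
        else:  # creates and updates both upsert the full document
            state[ev["key"]] = ev["doc"]
    return state

final = apply_changes(dict(snapshot), changes)
print(sorted(final))  # keys remaining after the change stream
```

Because each event carries the full document, replaying the same stream twice converges to the same state, which is what lets a destination stay consistent with the source in real time.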

Use Cases

Streaming data from Oracle to Fabric serves multiple purposes; some popular use cases include:

Financial Services

OracleDB is typically utilized to manage customer accounts, transactions, and market data. It is also useful for regulatory compliance in financial services like banking, insurance, and investment firms.

By streaming such financial data from Oracle to Microsoft Fabric, you can perform real-time fraud detection for prompt actions. You can also monitor stock prices for real-time forecasting with Fabric’s AI service. This allows you to execute trades based on immediate market fluctuations.

Healthcare

Streaming healthcare data such as patient records, medical imaging, and prescription data from Oracle to Fabric can help in many ways. Fabric’s support for healthcare solutions facilitates:

  • Patient monitoring in critical care.
  • Early detection of diseases or health issues.
  • Real-time analysis of medical imaging data.

Supply Chain Management

When you stream inventory, sales, and point-of-sale data from Oracle to Microsoft Fabric, you gain real-time visibility into inventory and demand. Fabric’s AI models can help forecast demand and optimize stock levels, benefiting retail as well as e-commerce.

Personalized Recommendations for Customers

By streaming customer profiles, purchase history, and shopping cart data to Fabric, you can:

  • Gather real-time customer insights.
  • Personalize product recommendations for customers.
  • Segment your customers in real-time for marketing campaigns.

Conclusion

An Oracle to Fabric integration can be beneficial in terms of cost-efficiency, AI-enhanced analytics, and real-time intelligence. To stream Oracle data to Microsoft Fabric, you can use Estuary Flow. With its intuitive interface, readily available Oracle and Microsoft Fabric connectors, and streaming capabilities, Estuary Flow simplifies the process.

You can stream Oracle to Fabric for fraud detection, analysis of medical imaging data, inventory management, and personalized marketing, all in real time. Fabric helps you meet various operational requirements with its real-time analytical offerings, Copilot, and OneLake.

Looking for an easy-to-use integration solution to centralize data from multiple sources for effective analytics? With Estuary Flow, it takes only a few minutes and a few clicks to set up your integration pipeline.

FAQs

What is mirroring in Fabric?

Mirroring in Fabric is a low-cost and low-latency solution that allows you to bring data from various systems together into a centralized analytics platform. You can use this to replicate your existing data from various Azure databases and external data sources into Fabric’s OneLake.

Does Oracle work with Microsoft?

Oracle’s partnership with Microsoft provides more choices for multi-cloud architecture. If you’re an Azure customer, you can get an OCI-like experience in Azure: you can procure, deploy, and use Oracle database services running on OCI (Oracle Cloud Infrastructure) from within the native Azure portal and APIs.


About the author

Dani Pálma, Head of Data Engineering Marketing

Dani is a data professional with a rich background in data engineering and real-time data platforms. At Estuary, Dani focuses on promoting cutting-edge streaming solutions, helping to bridge the gap between technical innovation and developer adoption. With deep expertise in cloud-native and streaming technologies, Dani has successfully supported startups and enterprises in building robust data solutions.
