Are you looking for a seamless solution to transfer data from Intercom to BigQuery for enhanced analytics? Look no further. 

In this comprehensive guide, we will walk you through two reliable methods that lay the groundwork for integrating data with BigQuery. This will allow you to unlock the full potential of your data through informed decision-making and transformative analytics.

Before we dive in, let's take a closer look at what each platform has to offer.

What is Intercom?

Intercom serves as a versatile and dynamic messaging platform that enables businesses to effectively communicate with both potential and current customers. Through its customizable messaging apps, Intercom provides a wide array of features and functionalities that facilitate seamless interactions. From easy collaboration to automated campaigns and real-time metrics, Intercom has become a valuable asset for businesses aiming to improve customer satisfaction and drive growth.

Key Features and Functionalities of Intercom

Automated Campaigns: Intercom's automated campaigns feature empowers businesses to create and execute targeted messaging campaigns based on specific triggers and events. By automating the delivery of messages, businesses can engage with their users at the right moment, improving user engagement, retention, and conversion rates. This feature saves time and effort by allowing businesses to set up campaigns once and let Intercom handle the rest.

Live Chat and Conversational Support: The live chat functionality offered by Intercom enables businesses to provide real-time support to their website visitors. This feature allows for immediate interaction and resolution of customer queries or issues, leading to enhanced customer satisfaction. With features like chat routing and agent collision detection, businesses can efficiently manage customer conversations and ensure a seamless support experience.

In-App Messaging: Intercom's in-app messaging feature enables businesses to communicate with their users directly within their mobile or web applications. This enables contextual and personalized messages, such as product updates, onboarding tips, or personalized recommendations, enhancing the overall user experience. By delivering relevant messages within the app, businesses can drive user engagement, promote feature adoption, and encourage user retention.

What is BigQuery?

BigQuery is a cloud-based data warehouse that allows you to store and analyze large amounts of data very quickly. It is a fully managed service, which means that Google takes care of all the underlying infrastructure, so you can focus on your data analysis. BigQuery uses SQL-like syntax, so it is easy to use for anyone familiar with SQL.

Key Features and Functionalities of BigQuery

Lightning-fast and highly scalable: BigQuery is purpose-built to handle massive datasets efficiently. It can process petabytes of data within seconds and seamlessly scales to accommodate growing data requirements.

SQL-like simplicity: With its SQL-like syntax, BigQuery provides an intuitive and familiar language for data analysis. Users experienced in SQL can quickly leverage their existing skills and tap into a wide range of tools and libraries, making data analysis more accessible and powerful.

User-friendly interface: BigQuery boasts a user-friendly interface that simplifies tasks such as data loading, query execution, and result visualization. Its intuitive design streamlines the data analysis workflow, enabling users to work efficiently and effectively. Additionally, the BigQuery API offers automation capabilities for streamlined data analysis tasks.

Fully managed service: As a fully managed service, BigQuery removes the burden of infrastructure management. Google takes care of all the underlying infrastructure, ensuring optimal performance, availability, and security. This allows users to focus solely on extracting insights from their data without the hassle of maintaining complex infrastructure.

How to Connect Intercom to BigQuery

There are two common approaches for connecting Intercom to BigQuery and seamlessly loading data:

Method 1: Using ETL scripts tailored to move data from Intercom to BigQuery

Method 2: Using SaaS tools like Estuary Flow to connect Intercom to BigQuery

Method 1: Using ETL Scripts Tailored to Move Data From Intercom to BigQuery

ETL scripts are sets of instructions used to extract data from one source, transform it, and load it into another. When connecting Intercom and BigQuery, an ETL script extracts data from Intercom, transforms it into a format compatible with BigQuery, and loads it into BigQuery. Intercom provides a wide range of APIs for data extraction.

Now, let’s look at the manual and custom procedure for extracting data from Intercom:

Step 1: Create an Application in the Intercom Developer Hub

First, create an application in the Developer Hub to get an access token. Once your application has been created, go to the Configure > Authentication section and click Copy to clipboard whenever you're ready to use the token, as shown below.


Step 2: Use Intercom API to Extract Data

To retrieve data from Intercom, you need to make calls to Intercom's REST API. You can use tools like Postman or curl to accomplish this task. Intercom offers a range of endpoints that allow you to extract various types of data, including conversations, tags, visitors, and more. The data you retrieve from these endpoints is returned in JSON format.

To illustrate, here is an example of a request to extract data for a specific Intercom conversation:

curl --request GET \
  --url https://api.intercom.io/conversations/id \
  --header 'accept: application/json' \
  --header 'authorization: Bearer token'

Upon making this request, the response will contain the requested conversation data in JSON format like this:

{
  "type": "conversation",
  "id": "147",
  "created_at": 1400850973,
  "updated_at": 1400857494,
  "conversation_message": {
    "type": "conversation_message",
    "subject": "",
    "body": "Hi Alice,\n\nWe noticed you using our product. Do you have any questions?\n- Virdiana",
    "author": { "type": "admin", "id": "25" },
    "attachments": [
      { "name": "signature", "url": "http://example.org/signature.jpg" }
    ]
  },
  "user": { "type": "user", "id": "536e564f316c83104c000020" },
  "assignee": { "type": "admin", "id": "25" },
  "open": true,
  "read": true,
  "conversation_parts": {
    "type": "conversation_part.list",
    "conversation_parts": [
      // ... list of conversation parts
    ]
  },
  "tags": { "type": "tag.list", "tags": [] }
}

You can read more about the other API endpoints Intercom provides here.
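In practice, the extraction step is usually scripted rather than run as one-off curl requests. The sketch below, in Python with only the standard library, pages through the Conversations endpoint. The pagination field used here (`pages.next`) is an assumption that varies between Intercom API versions, so check the API reference for the version you target.

```python
import json
import urllib.request

def next_page_url(payload):
    """Return the next-page URL from an Intercom list response, or None when done."""
    return (payload.get("pages") or {}).get("next")

def fetch_conversations(token, url="https://api.intercom.io/conversations"):
    """Yield conversation objects, following the API's pagination links."""
    while url:
        req = urllib.request.Request(url, headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
        })
        with urllib.request.urlopen(req, timeout=30) as resp:
            payload = json.load(resp)
        yield from payload.get("conversations", [])
        url = next_page_url(payload)
```

Each yielded conversation is a JSON object like the example response above; writing them out as newline-delimited JSON keeps them ready for the loading step later on.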

Step 3: Data Preparation

During the data preparation stage, there are several considerations to take into account:

Schema Creation: Define a schema for the tables in your BigQuery data warehouse that will receive the data imported from each Intercom endpoint. This schema ensures proper organization and structure of the data within BigQuery.

Data Type Alignment: It is important to ensure that the data types in your Intercom data align with the corresponding data types in BigQuery. BigQuery supports a wide range of data types, allowing for flexibility in handling different kinds of data. You can refer to the BigQuery documentation to explore the supported data types.

JSON Flattening (Optional): Depending on your specific use case, you may consider flattening the JSON data. Flattening involves transforming nested JSON structures into a tabular format, which can be beneficial for certain analysis and querying scenarios.
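To illustrate the flattening step, here is a minimal Python helper that collapses nested objects into underscore-separated columns. It is a sketch, not a complete transformer: lists (such as `conversation_parts`) are passed through unchanged and would need separate handling, for example as BigQuery repeated fields or serialized JSON strings.

```python
def flatten(record, parent_key="", sep="_"):
    """Collapse nested dicts into one level, e.g. {"author": {"id": "25"}}
    becomes {"author_id": "25"}. Lists are passed through unchanged here."""
    flat = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten(value, new_key, sep))
        else:
            flat[new_key] = value
    return flat

# A conversation record like the one shown earlier flattens to tabular columns:
conversation = {"type": "conversation", "id": "147",
                "author": {"type": "admin", "id": "25"}}
print(flatten(conversation))
# {'type': 'conversation', 'id': '147', 'author_type': 'admin', 'author_id': '25'}
```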

Step 4: Data Loading into BigQuery

There are multiple methods available to load data into BigQuery. Here are two commonly used approaches:

Option 1: BigQuery's Web UI

This approach involves manually loading data through the BigQuery web user interface. While it requires manual intervention, it is relatively straightforward and suitable for smaller datasets or one-time uploads.

Option 2: Loading via Google Cloud Storage and bq Commands

An alternative approach is to load data into Google Cloud Storage first and then leverage bq commands to transfer the data from Cloud Storage into BigQuery. This method is useful for handling larger datasets or automating the data-loading process.
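As a sketch of this second option, the snippet below shells out to the `bq` command-line tool, which must be installed and authenticated via the Google Cloud SDK. The dataset, table, and bucket names are placeholders; `--autodetect` asks BigQuery to infer the schema, which you may prefer to replace with the explicit schema defined in Step 3.

```python
import subprocess

def bq_load_command(dataset, table, gcs_uri):
    """Build the `bq load` invocation for newline-delimited JSON stored in GCS."""
    return [
        "bq", "load",
        "--source_format=NEWLINE_DELIMITED_JSON",
        "--autodetect",  # or pass an explicit --schema instead
        f"{dataset}.{table}",
        gcs_uri,
    ]

def load_to_bigquery(dataset, table, gcs_uri):
    """Run the load job; raises CalledProcessError if bq reports a failure."""
    subprocess.run(bq_load_command(dataset, table, gcs_uri), check=True)

# Hypothetical names: a dataset `intercom`, a table `conversations`, and a
# bucket to which the extracted JSON was uploaded (e.g. with `gsutil cp`):
# load_to_bigquery("intercom", "conversations", "gs://my-bucket/conversations.json")
```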

You can choose the data loading approach that best fits your requirements, taking into account factors such as data size, frequency of updates, and automation needs.

Drawbacks of Migrating From Intercom to BigQuery Natively

Time-consuming and complex: The process of migrating data from Intercom to BigQuery natively requires a lot of coding and manual intervention. This can be a time-consuming and complex process, especially for large datasets.

Limited data transformation capabilities: BigQuery is a powerful data warehouse, but it does not have as many built-in data transformation capabilities as some other data warehouses. This means that you may need to write custom code to transform your data in order to load it into BigQuery.

Difficult to achieve real-time data loading: If you need to load data from Intercom into BigQuery in real time, you will need to write custom code and configure cron jobs. This can be a complex and error-prone process.

Less flexibility and control: BigQuery uses a "black box" serverless architecture, which means that you have less flexibility and control over your settings than you would with some other data warehouses. This can be a disadvantage if you need to have specific control over your data processing.

More expensive than some other data warehouses: BigQuery uses pay-as-you-go pricing based on the storage you use and the data your queries scan. For large datasets and heavy query workloads, this can work out more expensive than some other data warehouses.

Method 2: Using No-Code Tools Like Estuary Flow to Connect Intercom to BigQuery


Estuary Flow is a state-of-the-art cloud-based data integration platform that revolutionizes the processing of streaming data in real time. Its core strengths lie in its ability to synchronize data instantly, ensuring that your information is always up-to-date for making informed decisions based on the latest insights.

Estuary Flow's low-latency, Change Data Capture (CDC) capabilities empower businesses to capture and synchronize real-time data changes seamlessly. CDC is a critical component of Estuary Flow's data integration framework, enabling the identification of modifications in source data systems and efficiently propagating these changes to target destinations.

The platform also empowers users to effortlessly build custom data pipelines through its intuitive user interface, allowing seamless integration of data from any source.

The best part is that Estuary Flow is a no-code platform, eliminating the need for coding knowledge and enabling users to simply select their data sources and collections for a hassle-free experience. Flow offers unparalleled scalability, effortlessly accommodating the needs of both small businesses and large enterprises, making it an ideal choice for organizations of any size.

Having gained an understanding of Flow's robust capabilities, let's delve into the seamless integration of Intercom with BigQuery using Flow. The process is designed to be straightforward and easy to follow, enabling you to establish a reliable connection between the two platforms effortlessly.

Prerequisites

Before you connect Intercom to BigQuery, there are a few prerequisites to consider:

  • To authenticate with Intercom, you can sign in directly with the Flow web app. You'll need the username and password associated with a user with full permissions on your Intercom workspace.
  • You can also configure authentication manually using the flowctl CLI. You'll need the access token for your Intercom account.
  • A new Google Cloud Storage bucket in the same region as the BigQuery destination dataset.
  • A Google Cloud service account with a key file generated and the roles specified in the documentation.

Step 1: Capture Data from Intercom

To capture data from Intercom using Estuary Flow, follow these steps:

  1. Log in to your Estuary Flow account or sign up for free to access the platform.
  2. Navigate to the Sources tab and click on "New Capture" to initiate the data capture process.
  3. Search for and select Intercom as the data source for your capture.
  4. Provide a unique name for your capture, ensuring it accurately represents the data you intend to capture.
  5. Using your Intercom API access token, authenticate your Intercom account within Estuary Flow to establish a secure connection between the two platforms.
  6. Click “Next” to initiate the connection between Flow and Intercom. All the tables Flow finds will be added as data collections.



  7. Click “Save and Publish” to begin ingesting data from Intercom. A pop-up window notifies you when the publishing process is complete.

Step 2: Materialize data to BigQuery

To materialize data to BigQuery:

  1. Choose BigQuery as the data destination.
  2. Give your materialization a name.
  3. Fill in the following fields:
    1. Project ID: Your Google Cloud Platform project ID.
    2. Service Account Key: The JSON key file for your service account.
    3. Region: The region where your BigQuery dataset is located.
    4. Dataset name: The name of your BigQuery dataset.
    5. GCS Bucket name: The name of your Google Cloud Storage bucket.
  4. Add source collections to materialize.
  5. Click "Next" to initiate the connection between Flow and BigQuery. Flow will map your captured Intercom collections to the corresponding BigQuery tables. You can review and modify the mappings using the Source Collections feature to ensure data alignment.
  6. Click "Save and Publish" to complete the migration of your Intercom data to BigQuery tables.

Any updates made in Intercom will be automatically synchronized by Flow, ensuring a continuous and synchronized flow of data to your BigQuery project.

Conclusion

With the methods outlined in this article, you now have the tools to seamlessly transfer data from Intercom to BigQuery for enhanced analytics. Whether you opt for ETL scripts or use a no-code alternative like Estuary Flow, you can establish a secure and efficient connection between the two platforms. 

By combining the powerful features of Intercom, such as automated campaigns and live chat support, with the lightning-fast processing and scalability of BigQuery, you can gain comprehensive insights and make informed decisions to drive your business forward. 

Leverage Estuary Flow's capabilities to automate data pipelines and tap into your data's full potential. Get started by signing up for free, or get in touch with us for more information and assistance.

Start streaming your data for free

Build a Pipeline