Want to dive deeper into your website's search performance data? Google Search Console (GSC) is a great starting point, but for advanced analysis and custom reports, you need the power of BigQuery.
This article walks you through two methods for moving your data from Google Search Console to BigQuery: Estuary Flow (a no-code pipeline you can set up in minutes) and Bulk Data Export (Google's built-in tool).
Let's explore how to seamlessly integrate these platforms for enhanced data analysis and optimization.
What is Google Search Console?
Google Search Console is a free platform for monitoring and optimizing your website's organic presence. You can use it to gain valuable insights into your site's performance, most-viewed pages, impressions, and keywords.
GSC provides a range of helpful reports, including:
- Page Indexing Report: See which of your website's URLs are indexed by Google and troubleshoot any indexing issues.
- Sitemaps Report: Monitor the status of your sitemaps and ensure Google can discover your website's content.
- Performance Report (Search): Analyze your website's search traffic, including impressions, clicks, and average position for different queries.
While GSC offers valuable data, its built-in analysis capabilities are limited. To fully leverage this data and unlock deeper insights, it's best to load your datasets into a powerful cloud warehouse like BigQuery.
What is Google BigQuery?
BigQuery is an enterprise-grade, fully managed cloud data warehouse. It has a wide range of built-in functionalities, including machine learning, geospatial analysis, and business intelligence capabilities.
Data is stored in a columnar format for fast analytical queries, and you can use standard ANSI SQL to run analytical and aggregation functions on your data. Because BigQuery is fully managed, its infrastructure handles resource allocation for you, so you don't have to deal with the complexities of provisioning and scaling.
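As a minimal sketch of what that ANSI SQL looks like in practice: the query below aggregates a hypothetical page-views table (the project, dataset, table, and column names are placeholders, not a real schema), and the commented-out lines show how you would run it through Google's official Python client.

```python
# A minimal sketch of an ANSI SQL aggregation in BigQuery.
# `my-project.analytics.page_views` and its columns are hypothetical.
query = """
SELECT
  page_url,
  COUNT(*) AS views,
  COUNT(DISTINCT visitor_id) AS unique_visitors
FROM `my-project.analytics.page_views`
WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
GROUP BY page_url
ORDER BY views DESC
LIMIT 10
"""

# With credentials configured, you would run it via the official client:
# from google.cloud import bigquery
# client = bigquery.Client()
# for row in client.query(query).result():
#     print(row.page_url, row.views)
```

Because the heavy lifting (scanning columnar storage, parallelizing the aggregation) happens inside BigQuery, the same query scales from megabytes to terabytes without code changes.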
Let’s look at some of the key features that BigQuery provides:
- Machine Learning: You can create and run machine learning models directly in BigQuery using GoogleSQL queries. The BigQuery ML functionality can be used through:
- Google Cloud Console
- BigQuery’s REST API
- BigQuery’s Integrated Colab Enterprise notebooks
- Command-line tool bq
- External tools such as Jupyter notebook
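To make the list above concrete, here is a hedged sketch of a BigQuery ML statement. The `CREATE MODEL ... OPTIONS(...) AS SELECT` shape follows BigQuery ML's GoogleSQL syntax, but the project, dataset, model, and column names are hypothetical placeholders.

```python
# Sketch: training a logistic regression model with BigQuery ML.
# All identifiers below (project, dataset, model, columns) are
# hypothetical; substitute your own training data.
create_model_sql = """
CREATE OR REPLACE MODEL `my-project.my_dataset.click_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['clicked']) AS
SELECT device, country, impressions, clicked
FROM `my-project.my_dataset.training_data`
"""

# The same statement can be submitted through the bq command-line tool,
# one of the access paths listed above:
# bq query --use_legacy_sql=false "<the SQL above>"
```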
- BigQuery BI Engine: BI Engine is a fast, in-memory analysis service that accelerates SQL queries from popular data visualization tools. Combined with partitioning and clustering on large tables, it helps keep dashboards responsive.
- Third-Party Visualization Tools: You can easily integrate business intelligence tools with BigQuery to get more out of your projects. BigQuery supports third-party visualization tools like Power BI, Tableau, and Looker Studio, which help you create interactive dashboards and reports for quick exploration and sharing of your data.
- BigQuery Sandbox: If you want to try BigQuery before making any purchases, you can use the Sandbox tool. This free tool has many of the features that BigQuery offers, allowing you to conduct testing and experimentation for smaller projects.
Methods to Load Data from Google Search Console to BigQuery
Now that we’ve looked at some of the functionalities of Google Search Console and BigQuery, let’s dive into the practical methods for transferring your data. Here are two easy-to-follow methods:
Method 1: Move Data from Google Search Console to BigQuery Using Estuary Flow
If you need your GSC data in BigQuery fast and without writing any code, Estuary Flow is a strong option. This no-code ETL tool streamlines the transfer with pre-built connectors and a cloud-native design that moves data from source to destination.
To load data from Google Search Console to BigQuery through Estuary Flow, create a free account or log in to your Estuary Flow account and follow these steps:
Step 1: Set up Google Search Console as the Source
- On the Estuary Flow dashboard, click the Sources tab, which is located on the left side navigation bar.
- On the Sources page, click the + NEW CAPTURE button.
- Type Google Search Console in the Search connectors box.
- Once you find the connector, click the Capture button to set it as your source.
- On the Create Capture page, under the Capture Details section, provide a unique Name.
- For the Endpoint Config, provide the Start Date. It is also recommended that you fill in the End Date and Custom Reports fields.
- There are two ways to authenticate your Google Search Console account: you can either use OAuth2 or authenticate manually with a service account key.
- Once you have filled in the fields correctly, click NEXT, and then SAVE AND PUBLISH.
Step 2: Set up BigQuery as the Destination
- Go back to the Estuary dashboard and click the Destinations tab.
- On the Destinations page, click the + NEW MATERIALIZATION button.
- Type BigQuery in the Search connectors box.
- Once you find the connector, click the Materialization button to set it as your destination.
- On the Create Materialization page, fill in the mandatory fields under Materialization Details and Endpoint Config. These include a unique Name, Project ID, Service Account JSON, Region, Dataset, and Bucket.
- If the data captured from Google Search Console was not filled in automatically, you can add it by clicking the Link Capture button under the Source Collections section.
- Finally, click NEXT and then SAVE AND PUBLISH.
That's it! Your data pipeline is now live, and Estuary Flow will start moving your GSC data into BigQuery.
To learn more about the pipeline setup, refer to Estuary Flow's detailed documentation.
Want to give it a try? Estuary Flow offers a free trial so you can experience seamless, real-time data transfer and unlock the full potential of your GSC data in BigQuery. Sign up today!
Method 2: Using Bulk Data Export (Google's Method)
Bulk Data Export is a newer feature from Google that moves your data from Search Console to BigQuery on a daily schedule. The export includes all performance data except anonymized queries, which Google filters out for privacy reasons. Here are the steps required to start a new bulk data export:
Prerequisites
- Set up a Google Cloud project with billing.
- Enable the BigQuery API and the BigQuery Storage API from the APIs & Services tab in the Cloud Console sidebar.
Step 1: Configure your Google Cloud Console Account
- Open your Google Cloud Console and choose the Google Cloud project where you want to export the data.
- Go back to the sidebar and click on the IAM & Admin tab.
- A new page that displays Permissions for your project will open.
- Click + GRANT ACCESS, which opens the Add Principals side panel.
- Under the New Principals section, paste the service account name:
search-console-data-export@system.gserviceaccount.com
- Grant this account two roles: BigQuery Job User and BigQuery Data Editor.
- Confirm all the changes by clicking Save.
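The role names in the steps above correspond to the IAM role IDs `roles/bigquery.jobUser` and `roles/bigquery.dataEditor`. As a sketch (not an official utility), the helper below checks whether an IAM policy, in the bindings format returned by Google Cloud's `getIamPolicy` APIs, grants both roles to the Search Console export service account; the policy here is a local mock, not a live API call.

```python
# Sketch: verify that an IAM policy grants the Search Console export
# service account the two roles the bulk export needs.
# The policy dict below is a mock in the getIamPolicy bindings format.
EXPORT_SA = "serviceAccount:search-console-data-export@system.gserviceaccount.com"
REQUIRED_ROLES = {"roles/bigquery.jobUser", "roles/bigquery.dataEditor"}

def has_export_permissions(policy: dict) -> bool:
    # Collect every role bound to the export service account.
    granted = {
        binding["role"]
        for binding in policy.get("bindings", [])
        if EXPORT_SA in binding.get("members", [])
    }
    return REQUIRED_ROLES <= granted

mock_policy = {
    "bindings": [
        {"role": "roles/bigquery.jobUser", "members": [EXPORT_SA]},
        {"role": "roles/bigquery.dataEditor", "members": [EXPORT_SA]},
    ]
}
```

If either role is missing, the export fails with a permissions error, which is one of the retryable conditions described below.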
Step 2: Configure your Google Search Console
- After setting up your Google Cloud project, open your Google Search Console account.
- Click on Settings and go to Bulk Data Export.
- Paste only the project ID (not the project name or number) of your Google Cloud project into the Cloud project ID field.
- Choose a suitable name for the dataset you are migrating. Remember, all dataset names must start with the string searchconsole.
- Select the location you want for your dataset, but be aware that you cannot change it once your data export is underway.
- Click Continue to begin the scheduled data exports.
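Before clicking Continue, it can help to sanity-check the naming rule from the steps above. A tiny sketch (the valid/invalid example names are hypothetical):

```python
# Sketch: Search Console requires export dataset names to begin with
# the string "searchconsole".
def is_valid_export_dataset_name(name: str) -> bool:
    return name.startswith("searchconsole")

# Hypothetical examples:
# is_valid_export_dataset_name("searchconsole_mysite")  -> valid
# is_valid_export_dataset_name("gsc_export")            -> rejected
```

Remember that the dataset location has no such quick check: once exports begin, it cannot be changed.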
The first data export occurs within 48 hours of successfully linking Search Console to BigQuery. If a non-persistent error occurs, such as a dataset location mismatch or missing permissions in the Cloud project, Search Console retries the export the following day. Avoid changing the exported dataset's schema during or after setup, as schema changes can cause future exports to fail.
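Once exports start landing, you can analyze them with plain SQL. The sketch below assumes the table and column names from Google's documented export schema (a `searchdata_site_impression` table with `data_date`, `impressions`, and `clicks` columns) and a hypothetical project ID; verify both against the tables that actually appear in your dataset.

```python
# Sketch: daily clicks, impressions, and CTR for the last 28 days of
# exported Search Console data. `my-project` is a placeholder; the
# table and columns follow Google's documented bulk-export schema.
export_query = """
SELECT
  data_date,
  SUM(impressions) AS impressions,
  SUM(clicks) AS clicks,
  SAFE_DIVIDE(SUM(clicks), SUM(impressions)) AS ctr
FROM `my-project.searchconsole.searchdata_site_impression`
WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY)
GROUP BY data_date
ORDER BY data_date
"""
```

Unlike the Search Console UI, nothing here is capped at 1,000 rows, so the same pattern extends to per-query and per-URL breakdowns across your full history.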
Which Method is Right for You?
| Method | Speed | Technical Skill | Real-time Updates |
|---|---|---|---|
| Estuary Flow | Fastest | None | Yes |
| Bulk Data Export | Slower | Minimal | No |
Takeaways
Google Search Console offers powerful insights for free, and BigQuery’s pay-as-you-go model makes it a cost-effective choice. These combined capabilities enable you to enhance your website’s performance while taking advantage of robust features.
In this article, we explored two methods of moving data from Search Console to BigQuery. However, if you want to save time and capture data changes in real time, Estuary Flow is the ideal choice. All materialization destinations with Estuary Flow are equipped to receive and update changes in your databases without any re-syncing. This gives you accurate information almost instantly, allowing you to make sound decisions for your business.
Sign up for Estuary Flow today and transfer your data from Google Search Console to BigQuery in just a few minutes!
Discover more insights on integrating Google Search Console data with other destinations:
FAQs
How do I connect Google Search Console to BigQuery?
To connect Google Search Console to BigQuery:
- Open Google Cloud Console and enable the BigQuery API by clicking +ENABLE APIS AND SERVICES.
- Grant permissions to Search Console using the +GRANT ACCESS option.
- Enter the Search Console service account name and click Save so Search Console can write data to your project.
- In Google Search Console, set up the connection to your Google Cloud project and configure the data export to BigQuery.
How do I export data from Google Search Console?
To export data from Google Search Console (GSC):
- Log in to your GSC account and navigate to the report you want to export.
- Click the Export button at the top right corner of the report page. Choose the preferred export format, such as CSV, Excel, or Google Sheets, to download the data directly to your system.
How to access BigQuery using the console?
To access BigQuery using the console:
- Open Google Cloud Console and sign in with your Google account.
- In the console toolbar, click the Navigation menu and scroll to the Analytics section.
- Select BigQuery.
About the author
With over 15 years in data engineering, the author is a seasoned expert in driving growth for early-stage data companies, focusing on strategies that attract customers and users. Their extensive writing provides insights that help companies scale efficiently and effectively in an evolving data landscape.