
BigQuery Data Transfer Service Explained

One thing you can be certain of when working with Google apps is that data transfer between services will be smooth. Importing data into BigQuery is no different, thanks to the BigQuery Data Transfer Service.

This handy feature allows for automated data imports from services such as Google Ads, Google Play, Campaign Manager, and YouTube. What’s more, you can also plug in external cloud services and warehouses and update your BigQuery tables regularly as any changes occur at the source.

And if any source isn’t covered, there’s a good chance a third-party tool can do the job.

BigQuery Data Transfer Service – how does it work?

BigQuery Data Transfer Service loads data from data sources into your BQ tables. All imports happen on a pre-set schedule and are fully automated. What’s more, you can manually launch data backfills to cover any data losses incurred during outages.

You can access the Data Transfer Service via three available methods:

  • Google Cloud Console
  • The bq command-line tool
  • Data Transfer Service API
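All three interfaces map onto the same underlying resources. As a rough illustration, here is how the Data Transfer Service v1 REST API organizes them (the project and location names below are placeholders):

```python
# Transfer configs live under a project/location "parent" path in the
# Data Transfer Service v1 REST API. Names below are placeholders.
project = "my-project"
location = "us"

base = "https://bigquerydatatransfer.googleapis.com/v1"
parent = f"projects/{project}/locations/{location}"

# Listing existing transfer configs (GET) or creating a new one (POST)
# both target this collection URL:
list_url = f"{base}/{parent}/transferConfigs"
print(list_url)
```

The Cloud Console and the bq command-line tool are essentially wrappers around these same resources.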

Most of the available sources are Google’s own services:

  • Google Ads
  • Google Ad Manager
  • Cloud Storage
  • Campaign Manager
  • Google Play
  • Google Merchant Center
  • Search Ads 360
  • YouTube Channel reports
  • YouTube Content Owner reports

You can also migrate data from external sources – Amazon S3, Amazon Redshift, and Teradata.

Note: The Data Transfer Service does not export data out of BigQuery. Check out the tool introduced later in this article if that’s what you’re looking for.

BigQuery Data Transfer Service – example

Let’s look at an example. We’ll set up a Data Transfer Service to pull data from our YouTube channel and into a BigQuery table.

To set it up in the Cloud Console, find BigQuery in the menu to the left. Then, select Data Transfers.

Click to enable BigQuery Data Transfer API. Then, click Create transfer.

Now it’s time to choose the source type from the drop-down list. We’ll go with YouTube Channel.

The configuration differs based on the source you pick. For a YouTube channel, the settings are described below.

You’ll need to choose a display name for each source and decide on the schedule for data imports. You can launch the first import right away or schedule it for a later time.

Then, pick the dataset to import the data to and decide on the table suffix. If you want, also enable email notifications so that Google sends you a message if a transfer fails at any point.
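Behind the form, these settings become fields on a transfer configuration. Here is a minimal sketch of what such a configuration might look like as JSON (field names follow the Data Transfer Service v1 REST API; all values are illustrative placeholders):

```python
import json

# Illustrative transfer configuration for a YouTube Channel source.
# "youtube_channel" is the data source ID; everything else is a placeholder.
transfer_config = {
    "displayName": "My YouTube channel",               # display name from the form
    "dataSourceId": "youtube_channel",                 # source type picked earlier
    "destinationDatasetId": "youtube_reports",         # dataset to import into
    "schedule": "every 24 hours",                      # import schedule
    "params": {"table_suffix": "_my_channel"},         # table suffix setting
    "emailPreferences": {"enableFailureEmail": True},  # failure notifications
}

print(json.dumps(transfer_config, indent=2))
```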

When finished, click Save. It will launch a popup asking for permission to access the source account. Allow it.

Once you do, the data import will launch automatically if you chose the Start Now option earlier. It may take some time depending on the volume of the imported data. You’ll see the list of completed runs as well as those in progress. For each run, you can open the logs, where any errors will also be visible.

If you look back at your BigQuery dashboard, you’ll also see the newly-created tables resulting from the recent import.

If you’re missing any data, you can schedule a backfill. Go back to the Transfer Details window and click the Schedule Backfill button in the top-right corner. Then, decide whether BigQuery should run a one-time transfer or pull data from a specific date range.
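A date-range backfill boils down to a pair of start/end timestamps. A sketch of such a request body (the requestedTimeRange shape follows the Data Transfer Service v1 REST API; the dates are placeholders):

```python
from datetime import datetime, timedelta, timezone

# Backfill the 7 days before an arbitrary cut-off date (placeholder).
end = datetime(2023, 6, 30, tzinfo=timezone.utc)
start = end - timedelta(days=7)

def rfc3339(ts: datetime) -> str:
    """Format a UTC datetime as an RFC 3339 timestamp with a Z suffix."""
    return ts.isoformat().replace("+00:00", "Z")

backfill_request = {
    "requestedTimeRange": {
        "startTime": rfc3339(start),
        "endTime": rfc3339(end),
    }
}
print(backfill_request)
```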

BigQuery Data Transfer Service for other data sources

The list of available apps for pulling the data isn’t that long, so you may feel limited at some point. This is when third-party solutions come in handy. 

BigQuery users frequently use Coupler.io to pull data from popular apps, such as HubSpot, Pipedrive, Airtable, QuickBooks, or even Google Sheets.

The list of available BigQuery integrations is much longer. Virtually anything else can be fetched with Coupler’s JSON to BigQuery importer. Coupler.io works in a very similar fashion to Google’s Data Transfer Service. The imports run automatically and take just a few minutes to set up. You can come up with your own schedule for data refresh and pull new data into BigQuery tables as often as every 15 minutes.

At the same time, Coupler.io allows you to send data from Google BigQuery to Google Sheets or Excel.

To get started, create a Coupler.io account. Then, click the Add new importer button.

Choose the Source app first. As an example, we’ll go with HubSpot.

Connect your account and choose what to import. When finished, jump into the Destination settings.

The available destinations are Microsoft Excel, Google Sheets, and BigQuery. Of course, we’ll go with the latter. Select it and press Connect.

To import into BigQuery, we’ll need to create a service account, generate new keys, and upload our individual JSON file. 

In the process, we’ll grant specific permissions to our service account so it can save data into our project. The BigQuery Data Transfer Service Agent role won’t be required in this case.

  • Head over to the Google Cloud Platform Console.
  • From the menu to the left, select IAM & Admin -> Service Accounts.
  • On top, select + Create Service Account.
  • Type in the service account name and, optionally, a description.
  • Add the roles BigQuery Data Editor and BigQuery Job User.
  • Press Done.
  • Next, find your newly-created service account on the list. Click the three-dot menu to the right of it and select Manage Keys.
  • Press Add Key and then Create New Key.
  • Pick the JSON key type and press Create.
  • Save the JSON file and upload it back on Coupler.io. Click Save.
  • Type in the name of an existing dataset. Do the same for the table name. If the table doesn’t exist yet, it will be created when the importer first runs.
  • Finish the Destination setup and jump to the Schedule tab. Choose when and how often the importer should run.
  • Then, press Save and Run. Once the import is finished, press the View Results button to jump directly into your BigQuery table.
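The JSON key you upload is simply a credentials file. A sketch of the fields it contains (the values here are fakes; a real key file also includes the private key itself and OAuth endpoints):

```python
# Fake service-account key for illustration; a real file is downloaded from
# the Manage Keys step and also contains "private_key" and token URIs.
key = {
    "type": "service_account",
    "project_id": "my-project",
    "private_key_id": "abc123",
    "client_email": "importer@my-project.iam.gserviceaccount.com",
}

# project_id says where the data lands; client_email is the identity that
# must hold the BigQuery Data Editor and BigQuery Job User roles.
print(key["client_email"])
```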

With the Coupler.io ETL tool, you can also copy BigQuery tables to other datasets or projects.

Pricing for the BigQuery Data Transfer Service

Importing data from all the available sources is free of charge. The only exception is Google Play, where you’re charged $25 per month for each unique Package Name in the Installs_country table.

It doesn’t mean, though, that you can use the BigQuery Data Transfer Service without any limitations. Each of your transfers falls under the BigQuery load job limits. For example, you can run up to 100,000 load jobs per project per day.

You can calculate your daily jobs by multiplying: 

  • # of transfers
  • # of tables (most apps create around 25 to 60 tables with each run)
  • Schedule frequency
  • Refresh window (# of days to include in each transfer)
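The multiplication above can be sketched with some illustrative numbers (all of the inputs below are assumptions for the example, not real limits):

```python
# Rough estimate of daily load jobs produced by scheduled transfers.
transfers = 2              # number of configured transfers
tables_per_run = 40        # many connectors create ~25-60 tables per run
runs_per_day = 1           # schedule frequency, e.g. a daily schedule
refresh_window_days = 7    # days re-loaded on each run

daily_jobs = transfers * tables_per_run * runs_per_day * refresh_window_days
print(daily_jobs)  # 560 - comfortably under the daily load-job limits
```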

Of course, standard fees apply for storing the data in BigQuery as well as querying it. Read more about BigQuery Cost.

What’s more, certain apps can incur costs for migrating the data out of their premises.

Final words

BigQuery Data Transfer Service is beneficial if you’re a heavy Google user, relying on different services and keeping a copy of all data in BigQuery tables. It’s no different for those of you who need to sync data between warehouses. The functionality is free, reliable, and runs on a schedule as well as on demand.

The limitation is that the transfers can run only in one direction – into BigQuery. If you need to sync this data further, you’ll need to resort to third-party solutions or native abilities of other services for importing BigQuery data. 

At this point, Coupler.io can pull BigQuery data into Google Sheets and Microsoft Excel, with more destinations coming in the future.

Thanks for reading!

  • Piotr Malek

Technical Content Writer who loves working with data, writing about it, and even producing videos about it. I’ve worked at startups and product companies, writing content for technical audiences of all sorts. You’ll often see me cycling🚴🏼‍♂️, backpacking around the world🌎, and playing heavy board games.
