How to Copy a BigQuery Table – All the Ways Explained in One Tutorial

Copying data in spreadsheets is a piece of cake – you can either copy and paste a cell range or make a copy of a sheet. In SQL-based data warehouses like BigQuery, the process of copying tables is different, but not complex. The main difference is that you can copy a BigQuery table in several ways, depending on your purpose. In this tutorial, I’ve collected all of the options you can rely on and explained their details so you can choose the best one for your project. Let’s get started!

What are the options in BigQuery to copy a table?

  • BigQuery console – the simplest way to manually copy a BigQuery table with a few button clicks. Using this option, you can copy BigQuery tables to another dataset or project.
  • SQL query – an alternative way to copy BigQuery tables using SQL queries.
  • Coupler.io – a third-party solution to copy a BigQuery table. It allows you to not only copy tables in BigQuery but also automate the data flow between them, consolidate data from multiple tables, and do other things valuable for reporting and analytics.
  • CLI tools – a manual way to copy BigQuery tables using the BigQuery command-line tool.
  • BigQuery API – a programmatic way to copy tables in BigQuery by sending POST requests to the BigQuery API. This is the most advanced option since it requires coding skills.

Each option suits a specific case. If you need to copy data from one table to another in BigQuery just once, the console or SQL option will do. For a recurring activity – say, automating the data flow from one table to another on a daily basis – Coupler.io is the ideal solution. I’ll guide you through each method so you can see what it looks like in practice, better understand how it works, and decide which one to employ for your needs. Let’s start with the simplest option.

Option #1 – Copy a BigQuery table to another table using the console

I have a BigQuery table named csv-data located in the demo dataset of the couplerio-demo project. To create a copy of this BigQuery table, simply click the Copy button in the console.


Then you need to specify the destination for your copy, namely:

  • Project
  • Dataset
  • Table (the name for the copied table)

Click Copy to put this into action.


There you go! Here is what the copied table looks like.


Option #2 – Copy a table in BigQuery using an SQL query

This method allows you to use the power of SQL to create a new table and copy data from another table into it. Here is what the syntax for such a query looks like:

CREATE TABLE `project-name.dataset-name.new-table-name`
COPY `project-name.dataset-name.table-name`;
  • CREATE TABLE
    • project-name – the name of the project where the new table will be created
    • dataset-name – the name of the dataset where the new table will be created
    • new-table-name – the name for the new table
  • COPY
    • project-name – the name of the project where the source table is located
    • dataset-name – the name of the dataset where the source table is located
    • table-name – the name of the table to be copied

Here is an example of what it may look like in practice:

CREATE TABLE `couplerio-demo.demo.csv_data_copy2`
COPY `couplerio-demo.demo.csv-data`;

Click Run to execute your SQL query and enjoy the result.
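
By the way, if the destination table may already exist, BigQuery’s CREATE TABLE COPY statement also accepts the usual DDL modifiers – to the best of my knowledge, IF NOT EXISTS skips the copy when the destination already exists, while CREATE OR REPLACE TABLE overwrites it. Reusing my demo names:

CREATE TABLE IF NOT EXISTS `couplerio-demo.demo.csv_data_copy2`
COPY `couplerio-demo.demo.csv-data`;

CREATE OR REPLACE TABLE `couplerio-demo.demo.csv_data_copy2`
COPY `couplerio-demo.demo.csv-data`;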


Option #3 – Copy a table in BigQuery using Coupler.io

Let’s explore how you can not only copy tables in BigQuery but also connect them using Coupler.io.

Coupler.io is a data automation and analytics platform. It provides an ETL solution to automate exports of data from multiple sources into Google Sheets, Microsoft Excel, and Google BigQuery. 

Using the BigQuery integration by Coupler.io, you can copy a source BigQuery table and synchronize it with the created copy. Here is what the flow looks like. You can watch it on our YouTube channel.

  • Sign up to Coupler.io, click Add new importer, and select the source and destination apps. In our case, BigQuery works for both.

After that, configure both connections as follows:

Source table

  • Connect to a BigQuery account to access the BigQuery project where your source table is located. For this, you’ll need to select a Google Cloud key file stored on your device and click Save.

Here are detailed instructions on how to get a Google Cloud key file.

  • Enter the SQL query to copy an entire table. Use the following syntax:
SELECT * FROM `{project}.{dataset}.{table}`

In my example, to copy a BigQuery table named csv-data located in the demo dataset of the couplerio-demo project, the SQL query will look like this:

SELECT * FROM `couplerio-demo.demo.csv-data`

Click Continue to proceed to the destination table configuration.

Destination table

  • If you’re making a BigQuery table copy within the same project, select the project from the drop-down list. If you’re going to copy the table to another project, you’ll need to connect it using a Google Cloud key file, as I did when configuring the source.
  • Then you’ll need to either specify an existing dataset and table or create new ones by typing in the new names. I want to use the same demo dataset and create a new table named csv-data-copy-coupler.

Let’s make a BigQuery table copy first, and then I’ll explain how you can connect the source and destination tables to enable automatic data refresh. To make a copy, click Save and Run.


Once the import is complete, you can click the View Results button to open the copied table. And here is what it looks like in BigQuery.


Automate BigQuery table copying and consolidation

As promised, I want to talk about a few awesome features you will definitely benefit from when using Coupler.io to copy tables in BigQuery.

First, you can automate this process by enabling the automatic data refresh and configuring the schedule. You can do this both during the importer’s initial setup and after the import has already been run.


The second feature to check out is data consolidation, or data stitching. The idea is that you can copy data from multiple tables within one project/dataset or from different ones, and consolidate this data into one destination table. For this, click the Add one more source button after you’ve configured the source table to copy.


You can add as many source tables as you want, but all of them will be merged into one destination table. If you need to make copies of multiple tables without merging them, you’ll have to create a separate importer for each table.

I’ve already introduced three ways to copy a BigQuery table to another table. Before we proceed to the other methods, let’s answer two questions that you may have after reading this.

How to copy a BigQuery table to another dataset

All the introduced options allow you to copy a BigQuery table to another dataset.

  • BigQuery console: you need to specify the desired dataset in the Destination section.
  • SQL query: you need to specify the desired dataset when making your query.
  • Coupler.io: you need to specify the desired dataset when configuring the destination table.

How to copy a BigQuery table to another project

All the introduced options allow you to copy BigQuery tables to another project if you have the appropriate access to it.

  • BigQuery console: you need to specify the desired project in the Destination section. You can click Browse to do this.
  • SQL query: you need to specify the desired project when making your query.
  • Coupler.io: you need to connect a Google BigQuery account with access to the associated BigQuery project using a Google Cloud key file.

Now I can proceed to the other two methods you can consider to copy BigQuery tables: CLI and API.

Option #4 – Copy a BigQuery table using the command-line tool

BigQuery provides a Python-based command-line tool, bq, to manipulate data. You can enter CLI commands in two ways:

  • from the Google Cloud console – for this, you need to activate Cloud Shell 
  • from the Google Cloud CLI – for this, you need to install and configure the Google Cloud CLI.

I’m going to demonstrate how you can copy a BigQuery table using the Google Cloud CLI.

Prerequisites: Make sure to install the latest version of the Google Cloud SDK and authenticate with your Google Cloud account. After the installation, you’ll need to set your default project.
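
For reference, here is roughly what that setup looks like in a terminal. These are standard gcloud commands; the project ID is the one from my example:

gcloud auth login
gcloud config set project couplerio-demo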

Here is the syntax of a command to copy a BigQuery table:

bq cp [source-table] [destination-table]
  • [source-table] should be specified in the following format: dataset-name.table-name
  • [destination-table] should be specified in the following format: dataset-name.table-name

Here is what the command looks like in my example:

bq cp demo.csv-data demo.csv_data_copy3

And here is the result – a copied BigQuery table. 


This is a basic example. However, the bq cp command also lets you copy a table to another project. For this, prefix the dataset name with the project ID in both table references:

bq cp [source-project]:[dataset-name].[table-name] [destination-project]:[dataset-name].[table-name]
  • [source-project] – specify the ID of the BigQuery project where the source table is located
  • [destination-project] – specify the ID of the BigQuery project where you want to copy the table

Note that a copy operation preserves the source table’s schema, so you don’t need to provide a schema file. The bq cp command also supports additional flags, such as -a to append the copied data to an existing destination table and -f to overwrite the destination table without prompting.

Option #5 – Copy a table using the BigQuery API

Finally, the thorniest way to copy a table in BigQuery – the API. To use this option, you’ll need the following:

  • A Google Cloud Platform (GCP) project with the BigQuery API enabled.
  • A service or user account with permission to read the source table and write to the destination table.
  • The BigQuery API client library for your programming language, for example, Python, installed in your development environment.
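
If you go with Python, the client library can be installed with pip (package name per the official google-cloud-bigquery library):

pip install google-cloud-bigquery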

Is everything checked? Then buckle up 🙂

Example of how to copy a BigQuery table to a dataset with Python

Let’s check out an example of using Python to copy a BigQuery table via the BigQuery API.

  • First, authenticate with your Google Cloud account and create a new BigQuery client object. Check out the BigQuery API documentation, where you will find the respective authentication and client creation instructions.
  • Then call the client’s copy_table() method with the following arguments:
    • source table: in the format project-name.dataset-name.table-name
    • destination table: in the format project-name.dataset-name.table-name
    • job_config: a ‘CopyJobConfig‘ object that defines any additional options, such as the write disposition.

Here is an example of how this may look in Python:

from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.CopyJobConfig()
# WRITE_TRUNCATE overwrites the destination table if it already exists
job_config.write_disposition = bigquery.WriteDisposition.WRITE_TRUNCATE

# Table IDs use the project.dataset.table format
source_table = "my-project.my_dataset.my_table"
destination_table = "my-project.my_dataset.my_table_copy"
copy_job = client.copy_table(source_table, destination_table, job_config=job_config)

The next step is to wait for the copy job to complete by calling the result() method on the ‘CopyJob‘ object. The job runs asynchronously on BigQuery’s side, and result() blocks until it finishes.

copy_job.result()  # blocks until the copy job completes

That’s it. Oh, and if you want to monitor the progress of the ‘CopyJob‘, you can periodically refresh the job with its reload() method (or re-fetch it via client.get_job() using the ‘job_id‘ property) and check its ‘state‘ property. Here is what it may look like:

import time

while copy_job.state != "DONE":
    print(f"Job {copy_job.job_id} is {copy_job.state.lower()}...")
    time.sleep(1)
    copy_job.reload()  # refresh the job state from the BigQuery API

if copy_job.error_result:
    print(f"Job {copy_job.job_id} failed: {copy_job.error_result['message']}")
else:
    print(f"Table {destination_table} was successfully copied!")

The status of the job will be printed every second until it completes or fails.
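
Alternatively, since result() raises an exception when a job fails, you can skip the manual polling and handle errors with a plain try/except. A minimal sketch (the exception class here is my assumption of a sensible catch-all from the google-cloud libraries):

from google.cloud.exceptions import GoogleCloudError

try:
    copy_job.result()  # waits for the copy job to finish
    print(f"Table {destination_table} was successfully copied!")
except GoogleCloudError as e:
    print(f"Copy job failed: {e}")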

FAQ on copying a BigQuery table

The following questions and answers should be quite useful since they complete the picture of how you can copy tables in BigQuery.

Can I copy a BigQuery table without data?

There is only one reason why anyone would need to copy a table without data in BigQuery – they need to reuse its schema. All the options I introduced in the article will let you make a copy of a BigQuery table that contains no data.

The more interesting case is when you need to copy only the schema of a non-empty BigQuery table, without its data.

Copy the BigQuery table schema

In BigQuery, you can benefit from the CREATE TABLE LIKE statement to copy only the schema of a source table. Here is the syntax you should use for your SQL query:

CREATE TABLE `project-name.dataset-name.new-table-name`
LIKE `project-name.dataset-name.table-name`;
  • CREATE TABLE
    • project-name – the name of the project where the new table will be created
    • dataset-name – the name of the dataset where the new table will be created
    • new-table-name – the name for the new table
  • LIKE
    • project-name – the name of the project where the source table is located
    • dataset-name – the name of the dataset where the source table is located
    • table-name – the name of the table whose schema will be copied

Here is what the SQL query example may look like:

CREATE TABLE `couplerio-demo.demo.table_schema_copy`
LIKE `couplerio-demo.demo.csv-data`;

When you run it, you’ll get a copy of the specified table with no data in it.
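
As a side note, a common SQL pattern – not specific to this article – achieves the same result with CREATE TABLE AS SELECT and a filter that matches no rows. Keep in mind that, unlike CREATE TABLE LIKE, this form may not carry over table options such as partitioning unless you specify them explicitly:

CREATE TABLE `couplerio-demo.demo.table_schema_copy2` AS
SELECT * FROM `couplerio-demo.demo.csv-data`
WHERE FALSE;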


Limitations on BigQuery table copying

If you ask ChatGPT about the limitations associated with BigQuery table copying, it would likely reply something like:

It’s a good idea to review the BigQuery documentation and check any relevant quotas and permissions before attempting to copy a table in BigQuery.

And it’s true. So I’ve done this, and here are the core limitations/requirements you should be aware of:

  • Permissions: To copy a BigQuery table, you need two permissions:
    • to read the source table 
    • to write to the destination table
  • Quotas: Your BigQuery account has quotas on the number of table copy operations. These quotas can be defined per day, per project, and per destination dataset. You can check your current quotas in IAM & Admin in the GCP console.
  • Limitations of specific copy methods: each method comes with its own constraints, so review the documentation for the one you choose before copying large or business-critical tables.

Which method to copy BigQuery tables is the best?

To put it simply, I believe that Coupler.io is the best method to copy your BigQuery table 😁

But I shouldn’t be biased, so I’d say the best method depends on your requirements and needs. If you need to make a table copy just once, there is no sense in signing up for Coupler.io – a few button clicks in the BigQuery console will do the job quickly and efficiently.

However, if you deal with this activity on a recurring basis, it makes sense to opt for the SQL option. And Coupler.io will be the best choice if you want to automate table copying for reporting or analytics.

Choose the best method for you and do this wisely. Good luck!

