
Renaming a table in BigQuery using Terraform

skj
Bronze 2

Hi everyone,

I am working on a project where we utilize Terraform to manage all BigQuery infrastructure, including datasets, tables, and other resources. I have a scenario where I need to rename a BigQuery table. I understand that this is possible using a DDL command as documented here:
Renaming Table 

However, since Terraform does not natively support renaming tables, I am looking for guidance on:

  • How to handle this within Terraform
  • What are the best practices for managing such changes without breaking the Terraform state
  • Whether it's better to clone/create a new table and deprecate the old one
  • How to manage the terraform state or import correctly in such cases

I would appreciate any advice or recommended patterns from those who have faced similar situations.

Thanks!

Solved
1 ACCEPTED SOLUTION

Hi @skj,

Terraform is not designed to handle data migration tasks like copying data between tables. It's great at provisioning infrastructure, but when it comes to moving actual data, that's considered outside its scope.

If you want everything orchestrated from a single YAML or Terraform configuration, here are a few common workarounds:

1. External Scripts Triggered via null_resource: Write a script (in Python, Bash, etc.) that copies data between tables, and use a null_resource with provisioner "local-exec" to run it as part of the Terraform apply (see the sketch after this list).

2. Terraform Modules with Embedded Logic (e.g., via Cloud SDKs): Some cloud providers expose SDK hooks or APIs you could wrap in a module to handle copying. This requires custom development, though, and may still rely on external scripts.

3. CI/CD Pipelines Calling Terraform + Scripts: Keep the YAML file as your source of truth (e.g., in GitHub Actions or Azure Pipelines) and let the pipeline run both Terraform and a data-copying step. It's one orchestration file, just with staged commands.
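For option 1, here is a minimal sketch, assuming a dataset my_dataset, a source table old_table, and a google_bigquery_table.new_table resource (all hypothetical names), with the bq CLI installed and authenticated wherever Terraform runs:

    # Minimal sketch, not a drop-in: the dataset, table, and resource
    # names below are placeholders.
    resource "null_resource" "copy_table_data" {
      # Re-run the copy only when the source/target pair changes.
      triggers = {
        source = "my_dataset.old_table"
        target = "my_dataset.${google_bigquery_table.new_table.table_id}"
      }

      provisioner "local-exec" {
        # bq cp copies the table's schema and data; -f overwrites the
        # target without prompting.
        command = "bq cp -f my_dataset.old_table my_dataset.${google_bigquery_table.new_table.table_id}"
      }
    }

Referencing the new table inside triggers gives Terraform an implicit dependency, so the table is created before the copy runs. Keep in mind that local-exec results are not tracked in state, so a failed copy only surfaces in the apply log.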

Here's what to avoid:

  • Don't try to embed data-copy logic directly into Terraform resources like google_bigquery_table; those resources don't support migrating data.
  • Don't rely on terraform apply alone to handle data state transitions; you could end up with missing or duplicated data.

Since you're using Google Cloud and Terraform for data migration, here are some helpful resources to guide you:

1. Terraform on Google Cloud Docs: This is your go-to hub for provisioning infrastructure on GCP using Terraform. It includes tutorials, best practices, and how-tos for managing state, modules, and more.

2. Terraform Registry: Offers detailed documentation on all Google Cloud resources supported by Terraform, including google_sql_database_instance, google_bigquery_table, and more.

3. Use Terraform with Dataform: If you're working with Dataform for analytics workflows, this guide shows how to manage repositories and workflows using Terraform.

4. GitHub: terraform-google-sql-db Module: A community-maintained module for creating Cloud SQL instances with high availability, backups, and more. Great for production-ready setups.

Hope it helps!


4 REPLIES

Hi @skj,

Welcome to Google Cloud Community!

Terraform currently manages BigQuery tables by their unique identifiers, which include the project_id, dataset_id, and table_id. While it doesn't support directly renaming BigQuery tables, changing the table_id in your google_bigquery_table resource prompts Terraform to recreate the table. This behavior helps ensure resource consistency and clarity, so it's important to handle such changes carefully to protect your data.
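To make that concrete, here's a minimal sketch with hypothetical names; editing table_id on a resource like this plans a replacement, not a rename:

    resource "google_bigquery_table" "example" {
      dataset_id = "my_dataset"  # hypothetical dataset
      table_id   = "old_table"   # changing this plans destroy + create, not a rename

      # Blocks the destroy half of that plan, so an accidental "rename"
      # fails loudly instead of dropping the table's data.
      deletion_protection = true
    }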

Here's how to approach this, including best practices and considerations:

  1. Manually Rename or Clone the Table: Use BigQuery DDL (ALTER TABLE ... RENAME TO) or clone the table to its new name.
  2. Update Terraform State: Since Terraform still thinks the table is named old_table, you'll need to manually update the state (see the sketch after this list). This tells Terraform that the resource now corresponds to the renamed table.
  3. Update Your Terraform Code: Change the table_id in your .tf file to match the new table name.
  4. Run a Plan to Confirm: Run terraform plan to ensure no changes are pending. If everything looks good, you're set.
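Putting those steps together, the manual workflow might look like this sketch (project, dataset, table, and Terraform resource names are all hypothetical):

    # 1. Rename the table in BigQuery itself via DDL
    bq query --use_legacy_sql=false \
      'ALTER TABLE `my-project.my_dataset.old_table` RENAME TO `new_table`;'

    # 2. Point Terraform state at the renamed table: drop the stale
    #    entry, then import the table under its new ID
    terraform state rm google_bigquery_table.example
    terraform import google_bigquery_table.example \
      projects/my-project/datasets/my_dataset/tables/new_table

    # 3. After updating table_id in the .tf file, confirm a clean plan
    terraform plan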

As for best practices:

  • Avoid renaming tables frequently in Terraform-managed infra unless absolutely necessary.
  • Use deletion_protection = true to prevent accidental data loss during schema or name changes.
  • Document manual steps in your repo (e.g., in a README.md or MIGRATION.md) so future maintainers understand the context.
  • Use terraform import if the table was renamed outside of Terraform and you want to bring it under management again.


Was this helpful? If so, please accept this answer as "Solution". If you need additional assistance, reply here within 2 business days and I'll be happy to help.

skj
Bronze 2

Thanks for the detailed information.

Since I am not moving forward with the state-manipulation approach, my plan would be:

  1. Create a new table using Terraform (new_table)
  2. Copy data from the old table to the new table (either outside or within Terraform)
  3. Destroy the old table

Is there a best practice for handling point 2, copying data from one table to another using Terraform? I want to manage everything from a single YAML file.


You can use a null_resource in Terraform to run the rename command.
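A minimal sketch of that idea is below (project, dataset, and table names are hypothetical, and the bq CLI must be available and authenticated where Terraform runs). The rename itself happens outside Terraform state, so the corresponding google_bigquery_table resource would still need the state update/import described above:

    resource "null_resource" "rename_table" {
      provisioner "local-exec" {
        # Runs the BigQuery DDL rename via the bq CLI.
        command = <<-EOT
          bq query --use_legacy_sql=false \
            'ALTER TABLE `my-project.my_dataset.old_table` RENAME TO `new_table`;'
        EOT
      }
    }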