
Renaming a table in BigQuery using Terraform

skj
Bronze 2

Hi everyone,

I am working on a project where we use Terraform to manage all of our BigQuery infrastructure, including datasets, tables, and other resources. I now have a scenario where I need to rename a BigQuery table. I understand that this is possible with a DDL statement, as documented here:
Renaming Table 
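
For reference, the DDL rename referenced above is a one-line statement; a minimal sketch, where the dataset and table names are placeholders:

```sql
-- Placeholder names; substitute your own dataset and table.
ALTER TABLE my_dataset.old_table RENAME TO new_table;
```

Note that this renames the table in place, so anything Terraform knows about the old name will drift out of sync with reality, which is the core of my question below.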

However, since Terraform does not natively support renaming tables, I am looking for guidance on:

  • How to handle this within Terraform
  • Best practices for managing such changes without breaking the Terraform state
  • Whether it's better to clone or create a new table and deprecate the old one
  • How to manage the Terraform state, or import correctly, in such cases

I would appreciate any advice or recommended patterns from those who have faced similar situations.

Thanks!

Solved
1 ACCEPTED SOLUTION

Hi @skj,

Terraform is not designed to handle data migration tasks like copying data between tables. It’s great at provisioning infrastructure, but when it comes to moving actual data, that’s considered outside its scope.

If you want everything orchestrated from a single YAML or Terraform configuration, here are a few common workarounds:

1. External Scripts Triggered via null_resource: You can write a script (in Python, Bash, etc.) to copy data between tables, and use a null_resource with provisioner "local-exec" to execute it as part of the Terraform run.
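
A minimal sketch of that pattern, assuming the bq CLI is installed and authenticated; the dataset and table names are placeholders, not from your setup:

```hcl
# Hypothetical sketch: copy table data via the bq CLI during terraform apply.
# Requires the hashicorp/null provider and an authenticated bq CLI.
resource "null_resource" "copy_table_data" {
  # Re-run the copy only when the source table reference changes.
  triggers = {
    source_table = "my_dataset.old_table"
  }

  provisioner "local-exec" {
    command = "bq cp --force my_dataset.old_table my_dataset.new_table"
  }
}
```

On newer Terraform versions (1.4+), the built-in terraform_data resource can play the same role as null_resource without the extra provider.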

2. Terraform Modules with Embedded Logic (e.g., via Cloud SDKs): Some cloud providers expose SDK hooks or APIs that you could wrap in a module to handle copying. This requires custom development, though, and may still rely on external scripts.

3. CI/CD Pipelines Calling Terraform + Scripts: You can keep a YAML pipeline file as your source of truth (e.g., in GitHub Actions or Azure Pipelines) and let the pipeline run both Terraform and a data-copying step. It's one orchestration file, just with staged commands.
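
As a sketch of the staged approach in GitHub Actions (job name, trigger, and table names are invented for illustration; this assumes credentials are already configured for both Terraform and the bq CLI):

```yaml
# Hypothetical workflow: provision with Terraform, then copy data as a
# separate, explicit step. Table names are placeholders.
name: migrate-table
on: workflow_dispatch

jobs:
  migrate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - name: Provision infrastructure
        run: terraform init && terraform apply -auto-approve
      - name: Copy table data
        run: bq cp --force my_dataset.old_table my_dataset.new_table
```

Keeping the data copy as its own named step makes failures easy to spot and retry, instead of burying them inside a provisioner.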

Here's what to avoid:

  • Don't try to embed data-copy logic directly into Terraform resources like google_bigquery_table; those resources don't support migrating data.
  • Don't rely on terraform apply alone to transition data state; you could end up with missing or duplicated data.
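
On the state-management side of your question, two CLI commands cover the common rename scenarios. This is a sketch; the resource addresses, project, and table IDs below are placeholders:

```shell
# Case 1: only the Terraform resource address changes
# (the BigQuery table itself keeps its ID).
terraform state mv \
  google_bigquery_table.old_name \
  google_bigquery_table.new_name

# Case 2: the table was renamed outside Terraform (e.g. via DDL).
# Remove the stale entry, then import the renamed table into state.
terraform state rm google_bigquery_table.old_name
terraform import google_bigquery_table.new_name \
  projects/my-project/datasets/my_dataset/tables/new_table
```

Run terraform plan afterwards and confirm it shows no pending destroy/create on the table before applying anything.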

Since you're using Google Cloud and Terraform for data migration, here are some helpful resources to guide you:

1. Terraform on Google Cloud Docs: This is your go-to hub for provisioning infrastructure on GCP using Terraform. It includes tutorials, best practices, and how-tos for managing state, modules, and more.

2. Terraform Registry: Offers detailed documentation on all Google Cloud resources supported by Terraform, including google_sql_database_instance, google_bigquery_table, and more.

3. Use Terraform with Dataform: If you're working with Dataform for analytics workflows, this guide shows how to manage repositories and workflows using Terraform.

4. GitHub: terraform-google-sql-db Module: A community-maintained module for creating Cloud SQL instances with high availability, backups, and more. Great for production-ready setups.

Hope it helps!

