Terraform import of google_bigquery_data_transfer_config gives a "no object exists" error

I have a GCP project that uses BigQuery scheduled queries. I'm trying to import these scheduled queries into Terraform, but I get the following error message when I run this command:

terraform plan -generate-config-out=generated.tf

│ Error: Cannot import non-existent remote object
│ 
│ While attempting to import an existing object to
│ "google_bigquery_data_transfer_config.default", the provider detected that no
│ object exists with the given id. Only pre-existing objects can be imported;
│ check that the id is correct and that it is associated with the provider's
│ configured region or endpoint, or use "terraform apply" to create a new
│ remote object for this resource.

To test it, I'm trying to import a single resource. My Terraform import file contains one import block like this:

import {
  id = "scheduled_query_1"
  to = google_bigquery_data_transfer_config.default
}

I'm using Terraform v1.6.6 and the google provider v5.14.0, and I'm referring to this official documentation: https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/bigquery_data_transfe...

I believe the documentation is incorrect (the name alone is not enough), or maybe I'm missing something. Please help me resolve this issue.

Regards  



1 ACCEPTED SOLUTION

The error "Cannot import non-existent remote object" indicates that Terraform cannot find an existing BigQuery Data Transfer Config in your Google Cloud Project with the ID you've provided. This could be due to several reasons:

Causes

  • Incorrect ID: The ID used in the import block or import command may not be the full ID of the existing scheduled query in GCP; the short transfer config name alone is not enough.
  • Region Mismatch: Terraform operations are region-specific. If the region in your Terraform configuration doesn't align with the region of the scheduled query, Terraform won't be able to locate it.
  • Permissions Issue: The service account or credentials used by Terraform might not have the necessary permissions to access BigQuery data transfer configurations.
  • Incorrect Resource Name: The documentation might be unclear, requiring a more detailed resource name for successful import.

Troubleshooting Steps

  1. Verify the ID:

    • Navigate to BigQuery -> Transfer Services in the Google Cloud Console.
    • Locate the scheduled query you wish to import and copy its Transfer Config ID, in the format projects/[PROJECT_ID]/locations/[LOCATION]/transferConfigs/[TRANSFER_CONFIG_ID].
    • Alternatively, you can list Transfer Config IDs through the Data Transfer Service API or the bq CLI (see the sketch after these steps).
  2. Check Region:

    • Confirm the region of your scheduled query in the Google Cloud Console.
    • Ensure your Terraform provider configuration specifies the same region.
  3. Review Permissions:

    • Verify that the service account or credentials Terraform uses have the necessary BigQuery permissions. For detailed information on required permissions, refer to the BigQuery permissions documentation.
    • At a minimum, the account needs the bigquery.transfers.get permission to read transfer configurations; it is included in broader roles such as roles/bigquery.admin.
  4. Full Resource Name in Import Command:

    • Use the complete resource name in your Terraform import command, like so:
       
      terraform import google_bigquery_data_transfer_config.default "projects/[PROJECT_ID]/locations/[LOCATION]/transferConfigs/[TRANSFER_CONFIG_ID]"
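
A quick way to get the exact value for steps 1 and 4 is the bq command-line tool. This is a minimal sketch, assuming a project called my-project with transfer configs in the us multi-region (both are placeholders to adjust):

      # List BigQuery Data Transfer configs in the given project and location.
      # --transfer_location accepts multi-regions (us, eu) or regions such as us-central1.
      # The name column should contain the full
      # projects/.../locations/.../transferConfigs/... path to use as the import id.
      bq ls --transfer_config --project_id=my-project --transfer_location=us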

Example Workflow

  • Acquire the Correct ID:

     
    id="projects/my-project/locations/us-central1/transferConfigs/my-transfer-config"
  • Prepare the Import Command: Use the terraform import command with the full ID of the BigQuery Data Transfer Config:

     
    terraform import google_bigquery_data_transfer_config.default "projects/my-project/locations/us-central1/transferConfigs/my-transfer-config"
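  • Import Block Alternative: If you prefer the import block plus terraform plan -generate-config-out workflow from your original attempt (supported on Terraform 1.5 and later), put the same full resource name in the block's id. A minimal sketch, reusing the hypothetical names from the example above:

    # The id must be the full resource name, not just the short transfer config name.
    import {
      id = "projects/my-project/locations/us-central1/transferConfigs/my-transfer-config"
      to = google_bigquery_data_transfer_config.default
    }
    # Then run: terraform plan -generate-config-out=generated.tf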

Important Notes:

  • Replace placeholders like [PROJECT_ID], [LOCATION], and [TRANSFER_CONFIG_ID] with your actual resource details.
  • You can import either with the standalone terraform import command or, on Terraform 1.5 and later, with an import block plus terraform plan -generate-config-out (the workflow you are already using); in both cases the id must be the full resource name.
  • The import syntax and requirements may vary slightly depending on your Terraform version. Always consult the latest provider documentation for the most accurate information.


2 REPLIES

The error "Cannot import non-existent remote object" indicates that Terraform cannot find an existing BigQuery Data Transfer Config in your Google Cloud Project with the ID you've provided. This could be due to several reasons:

Causes

  • Incorrect ID: The ID used in the import command may not accurately reflect the ID of the existing scheduled query in GCP.
  • Region Mismatch: Terraform operations are region-specific. If the region in your Terraform configuration doesn't align with the region of the scheduled query, Terraform won't be able to locate it.
  • Permissions Issue: The service account or credentials used by Terraform might not have the necessary permissions to access BigQuery data transfer configurations.
  • Incorrect Resource Name: The documentation might be unclear, requiring a more detailed resource name for successful import.

Troubleshooting Steps

  1. Verify the ID:

    • Navigate to BigQuery -> Transfer Services in the Google Cloud Console.
    • Locate the scheduled query you wish to import and copy its Transfer Config ID, typically in the format projects/[PROJECT_ID]/locations/[LOCATION]/transferConfigs/[TRANSFER_CONFIG_ID].
    • Alternatively, advanced users can obtain the Transfer Config ID through the Data Transfer Service API.
  2. Check Region:

    • Confirm the region of your scheduled query in the Google Cloud Console.
    • Ensure your Terraform provider configuration specifies the same region.
  3. Review Permissions:

    • Verify that the service account or credentials Terraform uses have the necessary BigQuery permissions. For detailed information on required permissions, refer to the BigQuery permissions documentation.
    • At a minimum, ensure the following roles are assigned: roles/bigquery.dataViewer and roles/bigquery.transfers.viewer.
  4. Full Resource Name in Import Command:

    • Use the complete resource name in your Terraform import command, like so:
       
      terraform import google_bigquery_data_transfer_config.default "projects/[PROJECT_ID]/locations/[LOCATION]/transferConfigs/[TRANSFER_CONFIG_ID]"

Example Workflow

  • Acquire the Correct ID:

     
    id="projects/my-project/locations/us-central1/transferConfigs/my-transfer-config"
  • Prepare the Import Command: Use the terraform import command with the full ID of the BigQuery Data Transfer Config:

     
    terraform import google_bigquery_data_transfer_config.default "projects/my-project/locations/us-central1/transferConfigs/my-transfer-config"

Important Notes:

  • Replace placeholders like [PROJECT_ID], [LOCATION], and [TRANSFER_CONFIG_ID] with your actual resource details.
  • Ensure you're using the terraform import command for importing resources into Terraform. The terraform plan command is not used for importing.
  • The import syntax and requirements may vary slightly depending on your Terraform version. Always consult the latest provider documentation for the most accurate information.

My issue was the ID I provided.
Once I changed the id to this format, as mentioned in @ms4446's response,
projects/[PROJECT_ID]/locations/[LOCATION]/transferConfigs/[TRANSFER_CONFIG_ID]
it resolved the issue.

Thanks a lot for your detailed explanation and quick response, @ms4446