
BigQuery argument to BigQuery sink: Data Fusion pipeline failed

Greetings

I received the following error while executing a data pipeline in GCP Cloud Data Fusion.

 

code 429 for io.cdap.cdap.internal.app.worker.Configuration Task 

Spark program 'phase-2' failed with error: Application application_1687440538763_0002 finished with failed status. Please check the system logs for more details 

 

(Attached screenshots: pipeline arguments.png, summary log error Pipeline.png)

The pipeline filters data from an existing BigQuery table based on some arguments, and the filtered results are written to a BigQuery sink. Deploying the pipeline showed no issues, but when I run it manually it fails with the errors shown above.

How can I fix it?

 

Solved
1 ACCEPTED SOLUTION

The error you are seeing in the log (java.io.IOException: Error occurred while importing data to BigQuery 'Existing table encryption settings do not match encryption settings specified in the request'. There are total error(s) for BigQuery job. Please look at BigQuery job logs for more information) is likely due to a mismatch between the encryption settings of the existing table in BigQuery and the encryption settings specified in your request. One way to solve this is to change the encryption settings of the table in place by essentially copying the table onto itself with the new encryption settings.
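Before changing anything, it can help to confirm which key the table is currently encrypted with, so you can see how it differs from the key configured (or not configured) in the pipeline's BigQuery sink. A minimal check with the bq CLI, using the hypothetical my_dataset.my_table as a stand-in for your own table:

    # Hypothetical dataset/table names; substitute your own.
    # If nothing is printed, the table uses Google-managed encryption (no CMEK).
    bq show --format=prettyjson my_dataset.my_table | grep -A 2 encryptionConfiguration

If the kmsKeyName shown here differs from the key the pipeline expects, that is the mismatch the error is referring to.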

Here's how you can resolve this issue using the bq command-line tool:

  1. Use the bq cp command to copy the table onto itself with the new encryption settings: specify the same source and destination table, along with the --destination_kms_key flag to set the new encryption key. Here's an example:

     
    bq cp -f --destination_kms_key=projects/[PROJECT]/locations/[KMS_KEY_LOCATION]/keyRings/[KMS_KEY_RING]/cryptoKeys/[KMS_KEY] [DATASET].[TABLE] [DATASET].[TABLE]

    Replace [PROJECT], [DATASET], [TABLE], [KMS_KEY_LOCATION], [KMS_KEY_RING], and [KMS_KEY] with your own values.

    The -f flag is used to force the overwrite of the existing table, and --destination_kms_key is used to specify the new KMS key for the destination table.

  2. Check the encryption settings of your table to confirm they have been updated correctly. You can do this with the bq show command, looking at the encryptionConfiguration field of the output:

     
    bq show --format=prettyjson [DATASET].[TABLE]
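As a concrete illustration with made-up values (my-project, us-central1, my-keyring, my-key, and sales.orders are all placeholders, not values from your pipeline), the two steps above would look like this:

    # 1. Re-encrypt the table in place by copying it onto itself with the new key.
    bq cp -f \
      --destination_kms_key=projects/my-project/locations/us-central1/keyRings/my-keyring/cryptoKeys/my-key \
      sales.orders sales.orders

    # 2. Verify which key the table now uses.
    bq show --format=prettyjson sales.orders

In the bq show output you should then see an encryptionConfiguration object whose kmsKeyName matches the key you passed to bq cp. Once the table's key matches the key configured in the pipeline's BigQuery sink, re-run the pipeline.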

