
Transfer BigQuery Table to Composer Bucket

Hi @ms4446 

 

Here is my problem description:

1) I am running an Airflow DAG on Cloud Composer. I have generated a table in BigQuery, and I now want to transfer this table as a csv to the Composer bucket.

2) This is because I want to send the csv as an email attachment to the final customer.

3) I will be using EmailOperator, and EmailOperator will expect the csv to live in the Composer bucket.

4) Can I use BigQueryToGCSOperator to transfer the table to the Composer bucket (like we normally transfer from BigQuery to a GCS bucket)?

5) My requirement is to send a csv generated from a BQ table as an email attachment, and this is what I figured out. A rough sketch of the DAG I have in mind follows the questions below.

Any tips?

Questions

6) How do I transfer the csv from a BQ table to the Composer bucket?

7) Do I have to use the Composer API? https://cloud.google.com/composer/docs/how-to/using/managing-dags#python

8) Can I use BigQueryToGCSOperator?

9) Is EmailOperator the best solution for this use case?

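Here is a minimal sketch of the DAG I have in mind (assuming Airflow 2 on Composer; the project, dataset, table, bucket, and email address are placeholders). I am relying on the fact that objects under gs://&lt;composer-bucket&gt;/data/ are mounted at /home/airflow/gcs/data/ on the workers, so EmailOperator can read the attachment from a local path:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.email import EmailOperator
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import (
    BigQueryToGCSOperator,
)

# Placeholder names -- replace with the real project, dataset, table,
# and Composer environment bucket.
COMPOSER_BUCKET = "my-composer-bucket"
SOURCE_TABLE = "my-project.my_dataset.my_table"

with DAG(
    dag_id="bq_table_to_email",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Export the BigQuery table as CSV into the Composer environment's bucket.
    # It is a regular GCS bucket, so BigQueryToGCSOperator works against it.
    export_to_gcs = BigQueryToGCSOperator(
        task_id="export_bq_table_to_gcs",
        source_project_dataset_table=SOURCE_TABLE,
        destination_cloud_storage_uris=[
            f"gs://{COMPOSER_BUCKET}/data/exports/my_table.csv"
        ],
        export_format="CSV",
    )

    # gs://<composer-bucket>/data/ is visible to workers at
    # /home/airflow/gcs/data/, so the attachment can be read from disk.
    send_email = EmailOperator(
        task_id="send_csv_email",
        to="customer@example.com",
        subject="Requested report",
        html_content="Please find the CSV report attached.",
        files=["/home/airflow/gcs/data/exports/my_table.csv"],
    )

    export_to_gcs >> send_email
```

This also assumes the environment already has email delivery (SMTP or SendGrid) configured, otherwise EmailOperator will fail.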

ACCEPTED SOLUTION

You're correct that rename_blob is a more direct way to move a file from one location to another within the same bucket. However, the key limitation of rename_blob is that it only works within a single bucket: internally it performs a copy followed by a delete, but both operations stay inside the same GCS bucket.
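For a cross-bucket move (for example, into the Composer bucket), you copy the object to the destination bucket and then delete the original. A minimal sketch with the google-cloud-storage client, using placeholder bucket and object names:

```python
from google.cloud import storage

client = storage.Client()

# Placeholder buckets/objects -- replace with your own.
source_bucket = client.bucket("source-bucket")
dest_bucket = client.bucket("my-composer-bucket")
blob = source_bucket.blob("exports/my_table.csv")

# rename_blob would only work inside source_bucket; for a cross-bucket move,
# copy the object to the destination bucket and delete the original.
source_bucket.copy_blob(blob, dest_bucket, new_name="data/exports/my_table.csv")
blob.delete()
```

Inside a DAG, the same move can be done with GCSToGCSOperator and move_object=True, which performs the copy and the delete for you.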

