
Transfer a CSV file from BigQuery to GCS without loading it into memory

Consider the following code:

 

# Initialize GCS client
gcs_client = storage.Client()

# Define the destination URI for the export
gcs_uri = f'gs://{gcs_bucket_name}/{gcs_blob_name}'

try:
    # Export BigQuery table to GCS as CSV
    job_config = bigquery.job.ExtractJobConfig()
    job_config.destination_format = bigquery.DestinationFormat.CSV
    bq_client.extract_table(
        table_id,
        gcs_uri,
        job_config=job_config
    )

    # Copy the CSV file to GCS
    gcs_blob = gcs_client.bucket(gcs_bucket_name).blob(gcs_blob_name)

 

Now, in the above code, I extract the data through the call below, so I assume the data is loaded into memory:

 

bq_client.extract_table(
    table_id,
    gcs_uri,
    job_config=job_config
)
 

Is it possible to transfer the CSV file to GCS without loading the data into memory? CC: @ms4446 

1 ACCEPTED SOLUTION

Yes, it is possible to transfer a CSV file from Google BigQuery to Google Cloud Storage (GCS) without loading the data into the memory of the machine running your script. The process in your code snippet is already designed to do exactly that: the export is a server-side operation, and the data moves directly from BigQuery to GCS within Google Cloud's infrastructure.

Here's a clarification on how the process works and why it does not load the data into the client's memory:

  1. Server-Side Operation: The bq_client.extract_table method initiates a server-side job in Google Cloud that exports the specified BigQuery table to a file in GCS in the CSV format. This operation is performed entirely within Google Cloud's infrastructure.

  2. No Local Memory Usage: At no point does this process require downloading the table data to your local environment or the memory of the machine where the script is running. Instead, the data moves directly from BigQuery to GCS within the cloud.

  3. Asynchronous Job: The extract_table method creates an asynchronous job on Google Cloud. To ensure the job completes, you should wait for the job to finish by calling .result() on the job object returned by extract_table. This step is missing in your snippet but is crucial for confirming the operation's success.
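One practical caveat the points above don't cover: BigQuery caps a single extract file at 1 GB, so exporting a larger table requires a wildcard (*) in the destination URI so the job can shard the output across multiple files. A minimal sketch of a helper that builds such a URI; build_extract_uri is an illustrative name, not part of either client library:

```python
def build_extract_uri(bucket: str, name: str, sharded: bool = False) -> str:
    """Build the gs:// destination URI for a BigQuery extract job.

    BigQuery caps a single extract file at 1 GB; for larger tables,
    pass sharded=True to get a wildcard URI so the export is split
    across multiple numbered files.
    """
    if sharded:
        stem, dot, ext = name.rpartition(".")
        # 'export.csv' -> 'export-*.csv'; names without an extension
        # simply get the wildcard appended.
        name = f"{stem}-*.{ext}" if dot else f"{name}-*"
    return f"gs://{bucket}/{name}"
```

The resulting string is passed to extract_table exactly like gcs_uri below. If file size is a concern, the extract job can also compress the output by setting job_config.compression = bigquery.Compression.GZIP.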

 
from google.cloud import bigquery

# Initialize the BigQuery client (the extract job runs server-side,
# so no GCS client is needed for the export itself)
bq_client = bigquery.Client()

# Define BigQuery table ID and GCS URI (replace placeholders!)
table_id = 'your-project.your-dataset.your-table'
gcs_bucket_name = 'your-bucket-name'
gcs_blob_name = 'your-desired-file-name.csv'
gcs_uri = f'gs://{gcs_bucket_name}/{gcs_blob_name}'

try:
    # Export BigQuery table to GCS as CSV
    job_config = bigquery.job.ExtractJobConfig()
    job_config.destination_format = bigquery.DestinationFormat.CSV

    extract_job = bq_client.extract_table(
        table_id,
        gcs_uri,
        job_config=job_config
    )
    extract_job.result()  # Wait for the job to complete

    print(f"Exported {table_id} to {gcs_uri}")

except Exception as e:
    print(f"Export failed: {e}")
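As a follow-up to the point about memory: you can confirm the export landed without ever downloading the object, because GCS lets you read an object's metadata alone. A rough sketch, assuming the google-cloud-storage package is installed; confirm_export and human_size are illustrative helpers, not library functions:

```python
def human_size(num_bytes: float) -> str:
    """Render a byte count as a short human-readable string."""
    for unit in ("B", "KiB", "MiB", "GiB"):
        if num_bytes < 1024 or unit == "GiB":
            return f"{num_bytes:.1f} {unit}"
        num_bytes /= 1024

def confirm_export(bucket_name: str, blob_name: str) -> str:
    """Check the exported object's metadata in GCS without downloading it."""
    # Imported here so the pure helper above works even without the
    # google-cloud-storage package installed.
    from google.cloud import storage

    blob = storage.Client().bucket(bucket_name).blob(blob_name)
    if not blob.exists():
        raise FileNotFoundError(f"gs://{bucket_name}/{blob_name} was not created")
    blob.reload()  # fetches metadata only; the object body is never read
    return f"gs://{bucket_name}/{blob_name} ({human_size(blob.size)})"
```

blob.exists() and blob.reload() issue metadata requests only, so even a multi-gigabyte export is verified without transferring its contents.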


2 REPLIES


Thank you @ms4446, it is really appreciated.