
Snowflake to BigQuery migration

Could anyone please help me understand the following:
I need to migrate data from Snowflake to BigQuery using the BigQuery Data Transfer Service. Is this possible, and is there any documentation on how to do it? Additionally, is Snowflake available for on-premise setups? In what scenarios would an on-premise Snowflake be useful?

1 ACCEPTED SOLUTION

Hi @Nikita_G,

BigQuery doesn't currently support using the Data Transfer Service with Snowflake, so you'll need an alternative migration strategy. Google Cloud provides excellent documentation on Snowflake to BigQuery migration. The guide covers schema changes, available tools, a sample export process, and even SQL translation options.
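To give a concrete (and purely illustrative) sense of the SQL translation part, here is a hypothetical Snowflake query and a BigQuery equivalent; the table and column names are made up, and the logic stays the same while the function names and format strings differ:

-- Snowflake
SELECT IFF(amount > 100, 'high', 'low') AS tier,
       TO_VARCHAR(order_date, 'YYYY-MM-DD') AS order_day
FROM orders;

-- BigQuery
SELECT IF(amount > 100, 'high', 'low') AS tier,
       FORMAT_DATE('%Y-%m-%d', order_date) AS order_day
FROM orders;

The translation tooling described in the guide can automate many rewrites like this one.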


2 REPLIES


Hi @Nikita_G, moving data from Snowflake to BigQuery requires a bit of creativity, since Snowflake isn't a native source for the BigQuery Data Transfer Service (BQ DTS). However, there are several alternative methods you can use to accomplish this effectively:

Alternative Approaches to Migrate Snowflake Data to BigQuery

1. Using a Cloud ETL Tool

ETL tools make it easy to integrate Snowflake with BigQuery by automating data transfers, transformations, and scheduling. For example, Windsor.ai supports Snowflake as a source and BigQuery as a destination. With it, you can:

  • Set up data pipelines quickly.
  • Ensure reliable transfers.
  • Enable incremental updates for efficiency.

This approach is ideal for those looking for simplicity and scalability without diving into custom scripts.

2. Custom Scripts with Cloud Storage as a Bridge

If you prefer a more hands-on solution, you can:

  1. Export Data from Snowflake: Use the COPY INTO command to export data as CSV or Parquet files.
  2. Upload to Google Cloud Storage: Transfer the exported files to a Cloud Storage bucket.
  3. Load Data into BigQuery: Use BigQuery’s native integration with Cloud Storage to import the data into your tables (a minimal load example is sketched after the export command below).

Example of the Snowflake COPY INTO command. Note that Snowflake can unload directly to a Cloud Storage bucket, which combines steps 1 and 2; this assumes a GCS storage integration (here called your_gcs_integration) has already been created in Snowflake:

COPY INTO 'gcs://your-bucket-name/your-folder/' FROM your_table
  STORAGE_INTEGRATION = your_gcs_integration FILE_FORMAT = (TYPE = PARQUET);
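
For step 3, once the Parquet files are in the bucket, a single BigQuery SQL statement can load them. Here is a minimal sketch, assuming a hypothetical dataset your_dataset, a target table your_table, and the bucket path used in the export above:

-- Load every Parquet file under the exported prefix into a BigQuery table
-- (dataset, table, and bucket names are placeholders).
LOAD DATA INTO your_dataset.your_table
FROM FILES (
  format = 'PARQUET',
  uris = ['gs://your-bucket-name/your-folder/*']
);

The same load can also be run with the bq command-line tool or from the BigQuery console if you prefer not to write SQL.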

This method gives you full control over the migration process, though it requires some setup and monitoring.

3. Data Virtualization

For real-time use cases, you can also consider data virtualization instead of a full migration: BI platforms such as Looker can query Snowflake and BigQuery side by side, which is especially useful if you want to avoid duplicating data.

Special Considerations for On-Premise Snowflake

While Snowflake is a cloud-based platform, the scenarios where teams typically ask for on-premise setups include:

  • Regulatory Compliance: Sensitive data restrictions that prevent cloud storage.
  • Legacy Infrastructure: Existing systems that favor on-premise solutions.

However, keep in mind that Snowflake is a cloud-only service that runs on AWS, Azure, and Google Cloud; it does not offer an on-premise deployment.

If you're looking for a simple and scalable way to connect Snowflake and BigQuery without writing custom scripts, an ETL tool like Windsor.ai is a great option. It offers a user-friendly interface and ensures a seamless data flow between platforms.

Hope this helps!