What is the Google-recommended product for continuous (batch or real-time) data transfer from an Azure Storage container to a GCP Cloud Storage bucket? Storage Transfer Service (STS) is the common solution, but it uses public routes over HTTPS. Once private connectivity between Azure and GCP is established (ExpressRoute plus Cloud Interconnect), is there a recommended approach for transferring data entirely over the private network? As I understand it, STS cannot use private networks for such transfers because it relies on publicly resolvable endpoints.
Hi, @Nikp. Have you checked Google Storage Transfer Service for this scenario? If so, what bottlenecks have you encountered? For more information on the available and recommended data transfer options, please refer to this documentation - Data transfer options. The Azure-side configuration is documented here - Configure access to a source: Microsoft Azure Storage.
Regards,
Mokit
Thank you for your response. Let me clarify my question: I am specifically asking whether STS can be configured to transfer over a private network, as opposed to the general use of STS.
If STS cannot support this, are there alternative solutions for transferring data securely over a private network between Azure and GCP?
If using STS with private connectivity isn’t an option for your scenario, I would build a custom data transfer solution on top of the private connectivity you have already established between Azure and Google Cloud (ExpressRoute on the Azure side, Cloud Interconnect on the GCP side).
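As a rough illustration of what such a custom solution could look like, the sketch below streams a blob from Azure to Cloud Storage using only the Python standard library. It assumes the common private-connectivity setup: an Azure Private Endpoint for Blob Storage (so the `privatelink.blob.core.windows.net` DNS suffix resolves to a private IP over ExpressRoute) and Private Service Connect / Private Google Access on the GCP side (so `storage.googleapis.com` resolves to a private IP inside the VPC). All account, container, bucket, and token names are hypothetical placeholders, and this is a minimal sketch, not a production pipeline.

```python
# Sketch: copy one object from Azure Blob Storage to GCS over private
# connectivity by addressing private-endpoint hostnames instead of the
# public ones. Account/container/bucket names and tokens are placeholders.
import urllib.request


def private_blob_url(account: str, container: str, blob: str, sas_token: str,
                     endpoint_suffix: str = "privatelink.blob.core.windows.net") -> str:
    """Build a blob URL that resolves to the Azure Private Endpoint
    rather than the public blob.core.windows.net address."""
    return f"https://{account}.{endpoint_suffix}/{container}/{blob}?{sas_token}"


def gcs_upload_url(bucket: str, object_name: str,
                   endpoint: str = "https://storage.googleapis.com") -> str:
    """Build a GCS JSON API media-upload URL. With Private Service
    Connect or Private Google Access, storage.googleapis.com resolves
    to a private IP inside the VPC, so traffic stays on the interconnect."""
    return (f"{endpoint}/upload/storage/v1/b/{bucket}/o"
            f"?uploadType=media&name={object_name}")


def copy_blob(azure_url: str, gcs_url: str, oauth_token: str) -> None:
    """Stream the source blob directly into the GCS upload request,
    so large objects are never fully buffered in memory."""
    with urllib.request.urlopen(azure_url) as src:
        req = urllib.request.Request(
            gcs_url,
            data=src,  # file-like body is sent with chunked transfer encoding
            method="POST",
            headers={
                "Authorization": f"Bearer {oauth_token}",
                "Content-Type": "application/octet-stream",
            },
        )
        urllib.request.urlopen(req)
```

The same pattern works with the official `azure-storage-blob` and `google-cloud-storage` SDKs by overriding their endpoint settings; the key point is simply that both sides must be addressed through hostnames that resolve to private IPs, since the transfer path is decided by DNS and routing, not by the client library.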
Regards,
Mokit
Thank you for your response. This solution looks effective for smaller data volumes. However, for continuous bidirectional batch transfer of large volumes of data, will this approach remain scalable and efficient? I want to be sure the solution can sustain high throughput and maintain performance over time. Your insights on this would be appreciated.