Hi,
I am new to GCP and have the following scenario: one of my GCP SQL tables, under the DB_Test6V database, holds 180 million rows, and I need to load this data into an on-prem SQL Server table through SSIS. With the current ODBC data source or ADO.NET source, it takes more than 2 hours just to load 9 million rows, and then the connection times out. Is there a better way to load such a huge amount of data into a SQL Server table in less time?
Just to clarify, is your source BigQuery or Cloud SQL? The body of your question contradicts the question title.
BigQuery
Thank you for confirming. Kindly see this Stack Overflow post, which addresses a question similar to yours.
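A commonly suggested alternative to streaming 180 million rows through an ODBC/ADO.NET source is to export the BigQuery table to Cloud Storage (e.g. with `bq extract` and a wildcard URI) and bulk-load the resulting files into SQL Server, processing the data in fixed-size batches so a single timeout doesn't kill the whole transfer. The batch sizes below are illustrative, not from the thread, and `make_batches` is a hypothetical helper sketching only the batching arithmetic:

```python
# Hypothetical helper: split a large extract into fixed-size row ranges so each
# batch can be exported and bulk-loaded independently. Row counts and batch
# size here are illustrative assumptions, not values from the thread.
def make_batches(total_rows: int, batch_size: int) -> list[tuple[int, int]]:
    """Return half-open (start, end) row ranges covering total_rows."""
    return [
        (start, min(start + batch_size, total_rows))
        for start in range(0, total_rows, batch_size)
    ]

# 180M rows in 10M-row batches -> 18 independent extract/load cycles instead
# of one monolithic transfer that risks a connection timeout.
batches = make_batches(180_000_000, 10_000_000)
print(len(batches))   # 18
print(batches[0])     # (0, 10000000)
print(batches[-1])    # (170000000, 180000000)
```

Each range can then drive one export query (e.g. ordered by a key column with `LIMIT`/`OFFSET` or a key-range predicate) followed by a bulk insert on the SQL Server side, so a failed batch can be retried without restarting from row zero.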