I have to create an ETL process where I need to read data daily from an on-premise database.
I can't find how to do this in GCP. Every example I have seen either exports the data to CSV files and then imports them into a bucket, or deals with real-time data streaming.
I need something simple: read the data once a day, directly from the on-premise database, process it, and then store it in a BigQuery table.
What is the best way to do this using GCP?
Hi,
You can create a Python script that connects to your on-prem database, fetches the data, and loads it into BigQuery.
Schedule the script with Cloud Composer if you already have it set up, or use a Cloud Function to execute the Python code on a daily trigger.
Note: you will have to sort out how to authenticate/connect to both the on-prem DB and BigQuery.
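To make the suggestion above concrete, here is a minimal sketch of such a script. It assumes a PostgreSQL source reachable from GCP; all names (host, table, project/dataset IDs) are placeholders, and the library choices (SQLAlchemy + pandas for extraction, google-cloud-bigquery for loading) are just one common option. The client libraries are imported inside the functions so the file can be inspected or unit-tested without them installed.

```python
# Daily slice only: yesterday's rows (assumed 'sales' schema with an
# 'updated_at' timestamp column -- adjust to your own tables).
SOURCE_QUERY = """
    SELECT id, amount, updated_at
    FROM sales
    WHERE updated_at >= CURRENT_DATE - 1
      AND updated_at <  CURRENT_DATE
"""

def extract(connection_url: str):
    """Read the daily slice from the on-prem database into a DataFrame."""
    import pandas as pd
    from sqlalchemy import create_engine

    # e.g. "postgresql+pg8000://user:password@on-prem-host:5432/salesdb"
    engine = create_engine(connection_url)
    return pd.read_sql(SOURCE_QUERY, engine)

def load(df, table_id: str) -> None:
    """Append the DataFrame to a BigQuery table.

    Authentication uses Application Default Credentials, which Cloud
    Functions / Composer workers provide via their service account.
    """
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(write_disposition="WRITE_APPEND")
    client.load_table_from_dataframe(df, table_id, job_config=job_config).result()

if __name__ == "__main__":
    df = extract("postgresql+pg8000://user:password@on-prem-host:5432/salesdb")
    load(df, "my-project.my_dataset.sales_daily")
```

For the on-prem connectivity mentioned in the note, the script itself does not help — you still need network reachability (for example Cloud VPN or Interconnect between GCP and your data center) and database credentials stored somewhere safe such as Secret Manager rather than hard-coded as above.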