Hi All.
I have a prospective client with a SingleStore database in their on-premises infrastructure, and they want to move 300 GB to BigQuery as a batch process.
Do you have any guidance on how to implement that integration? Can you recommend which GCP service or tool we could use?
Thanks in advance
@fbermeoq This can be one of the solutions => https://www.cdata.com/kb/tech/singlestore-sync-bigquery.rst
[One-time full-load option] You can also read the data from SingleStore using PySpark jobs running on GCP Dataproc (or Dataproc Serverless) that read those tables and load them into BigQuery (a rough sketch is below).
I would also recommend talking to Google staff, as they may be able to suggest a better solution after studying your use case.
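A minimal PySpark sketch of the one-time full-load idea, assuming a single large table with a numeric key column. All hostnames, credentials, bucket and table names below are placeholders for your environment; SingleStore is MySQL wire-compatible, so a plain JDBC read with the MySQL driver works, and the spark-bigquery connector handles the write (it is available on recent Dataproc images, otherwise add it via --jars or spark.jars.packages):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("singlestore-to-bigquery").getOrCreate()

# Read from SingleStore over JDBC (MySQL protocol). partitionColumn /
# numPartitions split the large read across executors instead of one task.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://singlestore-host:3306/source_db")  # placeholder host/db
    .option("dbtable", "source_table")                              # placeholder table
    .option("user", "app_user")                                     # placeholder credentials
    .option("password", "app_password")
    .option("partitionColumn", "id")                                # assumes a numeric key column
    .option("lowerBound", "1")
    .option("upperBound", "100000000")
    .option("numPartitions", "64")
    .load()
)

# Write to BigQuery via the spark-bigquery connector. The temporary GCS
# bucket stages the data before the BigQuery load job runs.
(
    df.write.format("bigquery")
    .option("table", "your-project.your_dataset.target_table")      # placeholder destination
    .option("temporaryGcsBucket", "your-staging-bucket")            # placeholder bucket
    .mode("overwrite")
    .save()
)
```

You would submit this as a Dataproc Serverless batch (gcloud dataproc batches submit pyspark ...) with the MySQL JDBC driver jar attached, and repeat or loop it per table for a full migration. Treat it as a starting point, not a finished pipeline.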