I'm currently managing a project on GCP where an application runs on 2 virtual machine instances. One of these VMs has a "data" disk (a 5 TB logical volume built from 5 disks) that is constantly used by the application: a kind of CRM with a lot of data.
Using a Cloud Storage bucket seems to be one possible way to achieve the daily backup, but I was wondering if rsyncing 5 TB (and more every day) into the bucket is the best way to go, considering that Google does not recommend using the "gcloud storage rsync" command for more than 1 TB, and the time to complete the task seems to be about 3x that of a normal rsync. I've also read about the Filestore service, but it seems too expensive for our budget. Considering the costs as well, would it be better to just create another VM with a 5 TB clone of the disk? Are there other possible ways?
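For context, the daily sync we had in mind is roughly the following (the mount point and bucket name are made-up placeholders; the script only prints the commands so it can be reviewed before actually running anything):

```shell
#!/usr/bin/env bash
# Sketch of the daily bucket sync we considered.
# SRC and DST are hypothetical names, not our real paths.
set -euo pipefail

SRC="/mnt/data"               # hypothetical mount point of the 5 TB logical volume
DST="gs://example-crm-backup" # hypothetical bucket name
DRY_RUN=1                     # set to 0 to actually execute the commands

# Helper: print the command instead of running it while DRY_RUN=1.
run() {
  if [ "$DRY_RUN" -eq 1 ]; then
    echo "WOULD RUN: $*"
  else
    "$@"
  fi
}

# Mirror the data directory into the bucket, deleting objects
# that no longer exist locally (same idea as rsync --delete).
run gcloud storage rsync --recursive \
  --delete-unmatched-destination-objects "$SRC" "$DST"
```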
Thanks a lot for the support!
Hello @massimo1980, welcome to the Google Cloud Community.
Have you considered the Backup and DR Service?
https://cloud.google.com/backup-disaster-recovery?hl=en
--
cheers,
DamianS
Hi Damian and thanks for the reply.
I was reading about the "Backup and DR Service", but it seems to cover the entire VM, while we only need to protect the 5 TB of data. Also, I can't find a pricing table to evaluate the cost of this operation.
Another downside of the Backup & DR service is that, in case of an emergency, we can't restore (for example) just some files; we have to restore the complete backup.
I was thinking about using a Cloud Storage bucket, as I mentioned before, but when we tried to create a scheduled daily Storage Transfer job for the backup, we were charged about $100 for transferring roughly 150 GB as a test.
And, as I mentioned before, Google does not recommend using "gcloud storage rsync" for this kind of capacity.
Right now it seems to me that the cheapest and fastest way to achieve the goal is to create another VM with the same disk space.
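For reference, here's roughly how I'd script that clone approach via a disk snapshot (disk, zone, and snapshot names are made up; the script only prints the commands for review):

```shell
#!/usr/bin/env bash
# Sketch of cloning the data disk so it can be attached to a second VM.
# All resource names below are hypothetical placeholders.
set -euo pipefail

DISK="crm-data-disk"
ZONE="europe-west1-b"
SNAP="crm-data-$(date +%Y%m%d)"
DRY_RUN=1 # set to 0 to actually execute the commands

# Helper: print the command instead of running it while DRY_RUN=1.
run() {
  if [ "$DRY_RUN" -eq 1 ]; then
    echo "WOULD RUN: $*"
  else
    "$@"
  fi
}

# 1. Snapshot the live data disk. Note: this is crash-consistent;
#    application-level consistency would need a filesystem freeze
#    or an application quiesce beforehand.
run gcloud compute disks snapshot "$DISK" --zone "$ZONE" \
  --snapshot-names "$SNAP"

# 2. Create a new disk from the snapshot, to attach to the clone VM.
run gcloud compute disks create "${DISK}-clone" --zone "$ZONE" \
  --source-snapshot "$SNAP"
```

A nice side effect is that snapshots are incremental, so after the first one the daily delta should be much smaller than re-copying 5 TB.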
Any other ideas ?