
Syncing 500GB of data from one GCP bucket to another

I am trying to sync 500GB of data from one GCP bucket to another across projects. Using the command below I am able to initiate the sync and it is copying files, but it is taking far too long (50GB copied in 2 days). Is there a faster way to do this? Note: I have nested folders in the source bucket, and the file count is around 74 million.

gsutil -m rsync -r gs://source_bucket gs://destination_bucket

3 Replies

Where are you running this command from?

Running it from my local PC via cmd.

If you're running it locally, it seems like it might be pulling the files down to your local PC and sending them back up over the wire. Could you run the same command from a VM in GCP or from Cloud Shell?
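As a sketch of that suggestion: the same rsync run from Cloud Shell or a GCE VM keeps the traffic on Google's network. With ~74 million small objects, the per-object overhead dominates, so raising gsutil's process/thread counts can also help (the values below are illustrative, not tuned; bucket names are the ones from the question).

```shell
# Build the command as a string first so it can be inspected before running.
# -m enables parallel copying; the -o options raise the worker pool size,
# which helps with very large numbers of small objects.
CMD="gsutil -o GSUtil:parallel_process_count=8 -o GSUtil:parallel_thread_count=16 -m rsync -r gs://source_bucket gs://destination_bucket"
echo "$CMD"
# Uncomment to actually run it (requires gcloud auth and access to both buckets):
# $CMD
```

For object counts in the tens of millions, the managed Storage Transfer Service (which runs entirely server-side) may also be worth a look.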