I need to go through several hundred GCP projects and total the GCS bucket sizes for some reporting.
Using bash scripts and "gsutil du -s <bucket>" is not efficient when working through thousands of buckets, some of which contain tens of terabytes.
I did happen upon a Python reference that looked very promising, but it doesn't look like the bucket.size parameter exists any longer. A code fragment is below. Any other suggestions for a Python-friendly API would be great.
Hi, ohflargen. If you want to do this for thousands of buckets, you'll likely have a better time by using Monitoring metrics. Specifically, storage/
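A hedged sketch of that Monitoring approach, assuming the metric meant is `storage.googleapis.com/storage/total_bytes` (a daily per-bucket metric; the exact name is my assumption) and the google-cloud-monitoring client library. This reads a precomputed datapoint per bucket instead of listing objects, so it scales to thousands of buckets:

```python
import time


def format_bytes(n: int) -> str:
    """Render a byte count in a human-friendly binary unit."""
    units = ["B", "KiB", "MiB", "GiB", "TiB", "PiB"]
    value = float(n)
    for unit in units:
        if value < 1024 or unit == units[-1]:
            return f"{value:.1f} {unit}"
        value /= 1024


def total_bucket_bytes(project_id: str) -> dict:
    """Latest total-bytes datapoint per bucket in one project (assumed metric name)."""
    from google.cloud import monitoring_v3  # pip install google-cloud-monitoring

    client = monitoring_v3.MetricServiceClient()
    now = time.time()
    # The metric is sampled roughly daily, so look back two days to be safe.
    interval = monitoring_v3.TimeInterval(
        {
            "end_time": {"seconds": int(now)},
            "start_time": {"seconds": int(now - 2 * 24 * 3600)},
        }
    )
    results = client.list_time_series(
        request={
            "name": f"projects/{project_id}",
            "filter": 'metric.type = "storage.googleapis.com/storage/total_bytes"',
            "interval": interval,
            "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
        }
    )
    sizes = {}
    for series in results:
        bucket = series.resource.labels["bucket_name"]
        # Points come back newest-first; keep the most recent value.
        sizes[bucket] = int(series.points[0].value.double_value)
    return sizes


if __name__ == "__main__":
    for project in ["my-project-1"]:  # hypothetical project IDs
        sizes = total_bucket_bytes(project)
        for bucket, nbytes in sorted(sizes.items()):
            print(f"{project}\t{bucket}\t{format_bytes(nbytes)}")
        print(f"{project}\tTOTAL\t{format_bytes(sum(sizes.values()))}")
```

Looping the `__main__` block over all of your projects (e.g. from `gcloud projects list`) gives one Monitoring query per project rather than one listing per bucket.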