
Python API for Bucket Size

I need to go through several hundred GCP projects and total the GCS bucket sizes for some reporting.

Using bash scripts and "gsutil du -s <bucket>" is not efficient when working through thousands of buckets, some of which contain tens of terabytes.

I did happen upon a Python reference that looked very promising, but it doesn't look like the bucket.size parameter exists any longer. A code fragment is below. Any other suggestions for a Python-friendly API would be great.

 
import googleapiclient.discovery

# Create a service object
service = googleapiclient.discovery.build('storage', 'v1')

# List the buckets in the project
buckets = service.buckets().list(project='my_project').execute()

# Iterate over the buckets
for bucket in buckets['items']:
    # Get the bucket size -- this key no longer seems to exist
    size = bucket['sizeBytes']

Hi, ohflargen. If you want to do this for thousands of buckets, you'll likely have a better time using Monitoring metrics, specifically the storage/total_bytes metric. You can use the Monitoring Python client to download these automatically, or just create a dashboard in Monitoring to identify the top buckets, for example.
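
For what it's worth, here is a minimal sketch of pulling that metric for one project with the google-cloud-monitoring client. The full metric type is storage.googleapis.com/storage/total_bytes. The project ID placeholder, the two-day lookback window, and the per-bucket summing across storage classes are assumptions for illustration, not a definitive implementation:

import time
from collections import defaultdict

from google.cloud import monitoring_v3

project_id = "my_project"  # assumed placeholder, as in the original fragment
client = monitoring_v3.MetricServiceClient()

# The metric is written infrequently (roughly daily), so look back far
# enough that at least one point falls inside the window.
now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {
        "end_time": {"seconds": now},
        "start_time": {"seconds": now - 2 * 24 * 3600},
    }
)

results = client.list_time_series(
    request={
        "name": f"projects/{project_id}",
        "filter": 'metric.type = "storage.googleapis.com/storage/total_bytes"',
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)

# Monitoring returns one time series per bucket and storage class, with
# points ordered newest first; sum the latest point of each series to
# get a per-bucket total.
totals = defaultdict(float)
for series in results:
    bucket_name = series.resource.labels["bucket_name"]
    totals[bucket_name] += series.points[0].value.double_value

for bucket_name, size_bytes in sorted(totals.items()):
    print(f"{bucket_name}: {size_bytes:,.0f} bytes")

Since this reads pre-aggregated metric data rather than listing individual objects, looping it over your several hundred project IDs should stay fast, which is exactly what makes it preferable to gsutil du at this scale.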