Hi,
We are working to estimate costs for applying Object Lifecycle rules across multiple Cloud Storage buckets in a project.
We know from the GCP Metrics Explorer that there are roughly 16.8 million objects across all the buckets in the project. Our team is looking to move a subset of those objects to Coldline storage using object lifecycle management rules. We also understand that SetStorageClass counts as a Class A operation and will incur charges.
However, we are not sure how many objects in those buckets will be impacted by the new Object Lifecycle rules. For example, we want to automatically move all objects older than 21 days to Coldline storage. That policy won't impact all 16.8 million objects, just the ones older than 21 days.
Is there a way to estimate how many objects will be impacted by an Object Lifecycle rule? Or is there a way to get a count of objects older than X days across multiple buckets?
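For reference, a 21-day Coldline transition of that shape is expressed as bucket lifecycle configuration JSON along these lines (a sketch; the `matchesStorageClass` condition here is an assumption and should be adjusted to the storage classes actually in use, and the file can be applied per bucket with `gcloud storage buckets update --lifecycle-file=...`):

```json
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 21, "matchesStorageClass": ["STANDARD", "NEARLINE"]}
    }
  ]
}
```

Note that the `age` condition is evaluated against each object's creation time, which is also what any pre-count of affected objects would need to check.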
We believe we can do this for a single bucket using Storage Insights inventory reports, but have not found a mechanism to do this at scale with the hundreds of buckets we have in the project.
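For what it's worth, one brute-force way to get that count is to list every object with the `google-cloud-storage` client library and filter by creation time. This is only a sketch under the assumption that listing all ~16.8M objects is acceptable (listing is itself billed as Class A operations):

```python
# Sketch: per-bucket count of objects older than N days across a project.
# Assumes the google-cloud-storage library is installed and application
# default credentials are configured. Listing objects incurs Class A charges.
from datetime import datetime, timedelta, timezone


def is_older_than(time_created, days, now=None):
    """True if an object's creation time is more than `days` days before `now`."""
    now = now or datetime.now(timezone.utc)
    return time_created < now - timedelta(days=days)


def count_old_objects(project_id, days=21):
    """Return {bucket_name: count} of objects an `age >= days` rule would match."""
    # Deferred import so the pure helper above can be used without credentials.
    from google.cloud import storage

    client = storage.Client(project=project_id)
    counts = {}
    for bucket in client.list_buckets():
        counts[bucket.name] = sum(
            1
            for blob in client.list_blobs(bucket.name)
            if is_older_than(blob.time_created, days)
        )
    return counts
```

For hundreds of buckets this is slow and not free, so it is better suited to spot-checking a few representative buckets than to a full-project census.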
Best,
Hi @andrewdelave66,
Welcome to Google Cloud Community!
Challenges:
Approaches:
1. Aggregated Storage Class Metrics:
2. Storage Insights Inventory Reports:
3. Firebase Extensions for Object Lifecycle Management (Experimental):
4. Third-Party Tools:
Considerations:
Manage SetStorageClass costs by applying rules gradually and monitoring costs.
Recommendations:
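On the inventory-report approach: if reports from many buckets are exported and downloaded, the per-bucket CSVs can be aggregated offline. A minimal sketch, assuming each report was configured to include a `timeCreated` metadata column (the columns present depend on the report configuration):

```python
# Sketch: count objects older than a cutoff across many downloaded
# Storage Insights inventory report CSVs (one or more files per bucket).
# Assumes each CSV has a "timeCreated" column with RFC 3339 timestamps.
import csv
import glob
from datetime import datetime, timedelta, timezone


def count_old_rows(csv_paths, days=21, now=None):
    """Total rows across all CSVs whose timeCreated is more than `days` days old."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=days)
    total = 0
    for path in csv_paths:
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                # "Z" suffix handling for Python versions before 3.11.
                created = datetime.fromisoformat(
                    row["timeCreated"].replace("Z", "+00:00")
                )
                if created < cutoff:
                    total += 1
    return total


# e.g. with reports downloaded under ./reports/<bucket>/*.csv:
# print(count_old_rows(glob.glob("reports/**/*.csv", recursive=True)))
```

Since inventory reports are generated from existing metadata, this avoids per-object list calls once the reports exist, which matters at the scale of hundreds of buckets.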
Hi Christian,
Thank you for the reply, this is great! Can you please provide some more detail on Approaches #3 and #4?
Specifically for #3, the Firebase Extensions: is there any documentation you can share on how to use these? Would we need to request preview access?
For #4, do you know what third-party tooling might support this functionality currently?
Best,