I have been using BigQuery for a while now, but this month one of my projects drastically increased in cost to almost 10 euros a day. One day it even hit 50 euros. I would like to find out how this project used 12.54 tebibytes in one week. The SKU is "Analysis" and the service is "BigQuery". How can I find out which queries / jobs are contributing the most to this total cost?
I have looked in the Logs Explorer, Monitoring, and the Billing reports, but could not find it there.
Hey @Parthis, I would take a closer look at question #5 from our Cloud FinOps FAQs. The Job Execution Report (as outlined in this blog) provides a per-job breakdown of slot utilization, among other job statistics. It allows you to drill down into individual jobs or understand trends in a specific group of jobs.
Hope this helps! And for future reference, you can engage in BigQuery-centric questions and conversations in our Data Analytics forum here 😀
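In the meantime, here is a minimal sketch of how you could rank last week's jobs by bytes billed yourself, using the INFORMATION_SCHEMA.JOBS_BY_PROJECT view (the `region-eu` qualifier and the 20-row limit are assumptions; adjust them to your project's region and needs):

```sql
-- Sketch: rank last week's query jobs by bytes billed.
-- `region-eu` is an assumption; use the region your datasets live in.
SELECT
  job_id,
  user_email,
  total_bytes_billed / POW(1024, 4) AS tebibytes_billed,  -- bytes -> TiB
  query
FROM `region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND job_type = 'QUERY'
ORDER BY total_bytes_billed DESC
LIMIT 20;
```

If you run this in the BigQuery console on the project in question, `total_bytes_billed` corresponds to what the on-demand "Analysis" SKU charges you for, so the top rows should account for most of that 12.54 TiB.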