
Looker dashboard cost

Hi, 

I am trying to get the cost of the queries that run in the background of a Looker Studio dashboard. I tried the query below.

with looker_studio_job_info as (
  -- Jobs issued by Looker Studio over the last 10 days
  select
    job_id,
    creation_time,
    user_email as datasource_owner,
    labels,
    referenced_tables,
    query
  from `region-eu`.INFORMATION_SCHEMA.JOBS
  where (select l.value from unnest(labels) l where l.key = "requestor") = "looker_studio"
    and creation_time >= timestamp_add(current_timestamp, interval -10 day)
)

select
  *,
  round(bytes_processed / 1024 / 1024 / 1024, 2) as total_processed_bytes_gib,
  date(creation_time) as creation_date
from looker_studio_job_info
left join (
  -- Bytes processed per job over the same window
  select
    job_id,
    sum(total_bytes_processed) as bytes_processed
  from `region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
  where creation_time >= timestamp_add(current_timestamp, interval -10 day)
  group by job_id
) as job_bytes_processed
  on looker_studio_job_info.job_id = job_bytes_processed.job_id
 
In some of my projects the referenced_tables field returns multiple project_id/table_id pairs (obviously because the query references them in some way). How do I get the cost attributed to a specific GCP project? And how are these queries charged, given that I already have the bytes processed?
3 REPLIES

I haven't run the query but it looks about right.  What does the table output look like?  When you use Looker Studio, I believe that you specify a project that will be billed for any BigQuery queries.  When you examine INFORMATION_SCHEMA.JOBS_BY_PROJECT, you are reading the history of jobs executed within THAT project.  When you run a query, the compute cost of that query is attributed to a single identified project, irrespective of which tables are examined.  The referenced_tables field lists the projects, datasets, and tables that were used to build the results, but they are not related to the project that will be billed for the query.
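For illustration, here is a minimal sketch (assuming the same region qualifier and the requestor = "looker_studio" label from your query) that puts the billed project next to the projects whose tables each job referenced; only the former matters for on-demand charges:

select
  job_id,
  project_id as billed_project,   -- the project that pays for the query
  total_bytes_billed,
  array(
    select distinct rt.project_id
    from unnest(referenced_tables) rt
  ) as referenced_projects        -- informational only, not what is billed
from `region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
where creation_time >= timestamp_add(current_timestamp, interval -10 day)
  and (select l.value from unnest(labels) l where l.key = "requestor") = "looker_studio"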

When I run this query I get the jobs within the timeframe, along with the owner of the data source and the amount of bytes processed in GiB, but I am not sure which amount should be used to work out the cost.

If we look at this page:

https://cloud.google.com/bigquery/pricing

We see the BigQuery pricing page.  Assuming that you are running on-demand, you are charged by billable bytes processed.  The charge for US regions is $6.25 per TiB processed.  As such, you could sum up the bytes billed for your jobs and do the arithmetic.
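As a rough sketch of that arithmetic (assuming the on-demand rate of $6.25 per TiB and using total_bytes_billed, which is the figure on-demand charges are based on):

select
  date(creation_time) as creation_date,
  round(sum(total_bytes_billed) / pow(1024, 4), 4) as tib_billed,
  round(sum(total_bytes_billed) / pow(1024, 4) * 6.25, 2) as estimated_cost_usd  -- adjust the rate to your region
from `region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
where creation_time >= timestamp_add(current_timestamp, interval -10 day)
  and (select l.value from unnest(labels) l where l.key = "requestor") = "looker_studio"
group by creation_date
order by creation_date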

Another way to determine costs is either to use the Billing Dashboard or to export the billing data to a table and then run a Looker Studio report over it.  Here is an article on performing that task:

https://cloud.google.com/billing/docs/how-to/visualize-data
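Once the export is set up, a hedged sketch of the kind of query such a report could sit on (the table name below is a placeholder for your own export table; the fields come from the standard usage cost export described in the linked docs):

select
  project.id as project_id,
  sum(cost) as total_cost,
  sum(ifnull((select sum(c.amount) from unnest(credits) c), 0)) as total_credits  -- credit amounts are negative adjustments
from `my-billing-project.my_dataset.gcp_billing_export_v1_XXXXXX`  -- placeholder: your billing export table
where service.description = "BigQuery"
  and usage_start_time >= timestamp_sub(current_timestamp, interval 10 day)
group by project_id
order by total_cost desc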