I'm struggling to find information on how BigQuery API calls are charged. Is calling tables.get with STORAGE_STATS free, or is there some hidden cost I should be aware of? https://cloud.google.com/bigquery/docs/reference/rest/v2/tables/get
The tables.get API call, when used with STORAGE_STATS, does not process data from the tables themselves; it retrieves metadata about them. This means table size is irrelevant: the work done by the API call does not change whether the table holds a megabyte or a petabyte. The call fetches details about the table's storage usage, which is a metadata operation and is not billed as data processed.
Therefore, calling tables.get for 10,000 tables, regardless of their size, would not incur data-scanning costs, because it never scans table data, only metadata. It is still a good idea to monitor overall API call volume if you are concerned about rate quotas. For a comprehensive picture, consult the BigQuery API Quotas and Limits page.
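To make concrete what is being called here: STORAGE_STATS is requested through the view query parameter on the tables.get REST endpoint. A minimal sketch of building such a request URL (the project, dataset, and table names are placeholders; an actual request would also need an OAuth bearer token):

```python
# Sketch: build the REST URL for a tables.get call that requests the
# STORAGE_STATS metadata view. No HTTP request is made here; a real call
# would need authentication (e.g. a bearer token from gcloud/ADC).

BASE = "https://bigquery.googleapis.com/bigquery/v2"

def tables_get_url(project: str, dataset: str, table: str,
                   view: str = "STORAGE_STATS") -> str:
    """Return the tables.get URL with the requested metadata view."""
    return (f"{BASE}/projects/{project}/datasets/{dataset}"
            f"/tables/{table}?view={view}")

print(tables_get_url("my-project", "my_dataset", "my_table"))
```

Whatever the table size, this is a single GET against the metadata endpoint, which is why the cost does not scale with the data stored.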
For BigQuery, using tables.get with the STORAGE_STATS view is an ordinary API request and does not incur additional charges. Note that BigQuery's 1 TB per month free tier applies to data processed by queries under on-demand analysis pricing; metadata calls such as tables.get are not billed per byte scanned at all, and are instead subject to API rate quotas. These calls are lightweight, so they are unlikely to cause problems unless issued at an exceptionally high frequency. It's still advisable to monitor your API usage to stay within quota.
For detailed pricing and more on API limits, visit:
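To illustrate that the response is pure metadata, here is a sketch that pulls the size-related fields out of a tables.get JSON response. The sample payload below is illustrative only, not captured from a real API call; BigQuery returns numeric fields such as numBytes and numRows as strings in the JSON body:

```python
# Sketch: extract storage metadata from a tables.get response body.
# The `sample` dict is an illustrative stand-in for a real response.

def storage_summary(table_resource: dict) -> dict:
    """Pick out size-related metadata fields from a table resource.

    BigQuery serializes 64-bit integers as JSON strings, so we
    convert them back to int here.
    """
    return {
        "num_bytes": int(table_resource.get("numBytes", 0)),
        "num_rows": int(table_resource.get("numRows", 0)),
    }

sample = {
    "tableReference": {"tableId": "my_table"},
    "numBytes": "1048576",
    "numRows": "5000",
}
print(storage_summary(sample))  # → {'num_bytes': 1048576, 'num_rows': 5000}
```

Nothing here touches table rows; the sizes are reported as already-computed statistics in the metadata, which is what keeps the call free of scanning charges.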
So that's my understanding, but how much data does it process, and does that depend on the size of the table?
My concern is: say I call tables.get across 10,000 tables, and they are large tables. Is that going to show up in billing because it's scanning some hidden system view that I can't see?