
What's the default Scale and Precision of the Numeric type in BigQuery

What's the default Scale and Precision of the Numeric type in BigQuery when I write SELECT CAST(12.12 AS NUMERIC) and create a table from that query?


When you run a query like SELECT CAST(12.12 AS NUMERIC) and create a table from it in BigQuery, the default precision and scale for the NUMERIC data type are 38 and 9, respectively.

Here's what these terms mean:

  • Precision is the total number of digits that a number can have.
  • Scale is the number of digits allowed after the decimal point.

So, for the NUMERIC data type in BigQuery, the default precision is 38, meaning a number can have up to 38 digits in total. The default scale is 9, which means a number can have up to 9 digits after the decimal point.
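As a quick sanity check, you can create a table from the cast and inspect the column type that BigQuery assigns via INFORMATION_SCHEMA. This is just a sketch; my_dataset and numeric_demo are placeholder names:

-- Create a table from the cast; the column becomes NUMERIC,
-- which carries the default precision 38 and scale 9.
CREATE TABLE my_dataset.numeric_demo AS
SELECT CAST(12.12 AS NUMERIC) AS amount;

-- Inspect the column type BigQuery recorded for the new table.
SELECT column_name, data_type
FROM my_dataset.INFORMATION_SCHEMA.COLUMNS
WHERE table_name = 'numeric_demo';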

How about the BIGNUMERIC type?

The BIGNUMERIC type in Google Cloud BigQuery is an extension of the standard NUMERIC type, designed to handle even larger numbers. 

When you perform operations like SELECT CAST(12.12 AS BIGNUMERIC), BigQuery will cast the number 12.12 to a BIGNUMERIC type with these settings. If you create a table from this query, the column with the cast BIGNUMERIC value will inherit these precision and scale settings.

The BIGNUMERIC type is particularly useful in scenarios where extremely large numbers or high precision calculations are required, which exceed the capabilities of the standard NUMERIC type.
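For example, a small sketch that casts the same high-precision literal to both types shows NUMERIC rounding at 9 fractional digits while BIGNUMERIC keeps them all:

SELECT
  CAST('1.00000000012345' AS NUMERIC)    AS as_numeric,     -- rounded to 9 fractional digits: 1
  CAST('1.00000000012345' AS BIGNUMERIC) AS as_bignumeric;  -- all digits kept: 1.00000000012345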


@Stanislav wrote:

BigQuery will cast the number 12.12 to a BIGNUMERIC type with these settings


Sorry, I'm not following. Is it precision 38 scale 9 as with the NUMERIC, or some other value?

Hello All,

I think the BigQuery data types documentation should help.

For BIGNUMERIC (alias BIGDECIMAL), the precision is 76.76 and the scale is 38.
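As a small sketch (my_dataset and decimals_demo are placeholder names), BigQuery also accepts parameterized precision and scale on these types if the defaults are more than you need:

CREATE TABLE my_dataset.decimals_demo (
  n  NUMERIC,          -- default precision 38, scale 9
  bn BIGNUMERIC,       -- default precision 76.76, scale 38
  p  NUMERIC(10, 2)    -- explicit precision 10, scale 2
);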
 
Hope this helps!