Hi @nids_k,
I hope I understand your question!
Spark SQL itself has no bignumeric type, but you can cast the column to a decimal with the precision and scale you need, and the spark-bigquery-connector will map a wide decimal to BigQuery's BIGNUMERIC on write. Here is an example:
df = spark.sql("""
SELECT cast(col1 as decimal(38,20)) as col1
FROM table1
""")
This code will create a DataFrame with a column called col1 of type DecimalType(38,20), i.e. 38 digits of precision and 20 digits of scale.
Please note that the connector picks the BigQuery type from the decimal's precision and scale: a Spark decimal with precision <= 38 and scale <= 9 fits BigQuery's NUMERIC, while anything with a larger scale is written as BIGNUMERIC (supported in recent connector versions). Since DecimalType(38,18) has a scale of 18, it targets a BigNumeric column.
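As a rough illustration of that precision/scale rule, here is a minimal sketch. The helper name below is hypothetical, not a connector API; the thresholds come from BigQuery's NUMERIC limits (precision 38, scale 9):

```python
# Illustrative sketch (hypothetical helper, not part of the connector API):
# BigQuery NUMERIC holds at most precision 38 and scale 9; a Spark
# DecimalType outside those bounds is written as BIGNUMERIC instead.

def bigquery_type_for_decimal(precision: int, scale: int) -> str:
    """Return the BigQuery type targeted for a Spark DecimalType(p, s)."""
    if precision <= 38 and scale <= 9:
        return "NUMERIC"
    return "BIGNUMERIC"

# DecimalType(38, 18) exceeds NUMERIC's maximum scale of 9:
print(bigquery_type_for_decimal(38, 18))  # BIGNUMERIC
print(bigquery_type_for_decimal(38, 9))   # NUMERIC
```

This is why the DecimalType(38,18) column mentioned in the question ends up in BigNumeric territory: its scale of 18 cannot be represented by NUMERIC.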
Let me know how it works out!
How to write data from PySpark into BigQuery BigNumeric datatype?
I have defined a column as DecimalType(38,18) in PySpark, but it is only compatible with the Numeric datatype of BigQuery. Which datatype in PySpark should be used to write to a BigNumeric column in BigQuery?