
Unable to handle BIGNUMERIC in Spark

I am facing the below issue in our Spark code.
We are running Spark code as a Dataproc Serverless batch on Google Cloud Platform, and it runs into a problem while writing data to a BigQuery table. A few of the columns in the BigQuery table have the datatype BIGNUMERIC, and the Spark write changes the datatype from BIGNUMERIC to NUMERIC. We need the datatype to stay BIGNUMERIC, since we need data with (38,20) precision.
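
For context, the write to the BigQuery table looks roughly like the sketch below (the table and bucket names are placeholders for illustration):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bignumeric-load").getOrCreate()

# Source data; in the real job this comes from upstream tables.
df = spark.table("table1")

# Write via the spark-bigquery connector. After this load, the BIGNUMERIC
# columns in the target table end up as NUMERIC instead.
df.write.format("bigquery") \
    .option("table", "project.dataset.table1") \
    .option("temporaryGcsBucket", "tmp-bucket") \
    .mode("append") \
    .save()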

 
Could we cast a column to BIGNUMERIC in a Spark SQL DataFrame, the way the code below casts to decimal?
We have used the code below while loading data into the BIGNUMERIC column in BigQuery:
 
df = spark.sql("""SELECT CAST(col1 AS decimal(38,20)) AS col1 FROM table1""")
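
Would something like the sketch below keep the column as BIGNUMERIC? This is only a sketch: it assumes the target table is created up front with col1 declared as BIGNUMERIC(38,20), and that appending writes into the existing schema rather than recreating the table. decimal(38,20) seems to be the closest Spark-side type, since Spark's DecimalType caps precision at 38.

# Assumes project.dataset.table1 already exists with col1 BIGNUMERIC(38,20),
# and a spark-bigquery connector version that supports BIGNUMERIC.
df = spark.sql("""SELECT CAST(col1 AS decimal(38,20)) AS col1 FROM table1""")

# Appending should load into the existing table schema, so BigQuery would
# coerce the decimal values into the BIGNUMERIC column instead of narrowing it.
df.write.format("bigquery") \
    .option("table", "project.dataset.table1") \
    .option("temporaryGcsBucket", "tmp-bucket") \
    .mode("append") \
    .save()

Or is there a connector option we are missing that forces BIGNUMERIC on write?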
 
 