I have a collection of Parquet files that I need to load into a BigQuery table using the Data Transfer API. When I type my data for storage, I use the following pyarrow field types, kept in sync with the BigQuery types:
```
import pyarrow as pa

# BigQuery type -> pyarrow type
BQ_TO_PYARROW = {
    "STRING": pa.string(),
    "TIMESTAMP": pa.timestamp("ns"),
    "NUMERIC": pa.decimal128(precision=38, scale=9),
    "BIGNUMERIC": pa.decimal256(precision=76, scale=38),
    "DECIMAL": pa.decimal128(precision=38, scale=9),
    "BIGDECIMAL": pa.decimal256(precision=76, scale=38),
    "INTEGER": pa.int32(),
    "FLOAT": pa.float64(),
}
```
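For context, the mapping feeds into building the pyarrow schema for each file, roughly like this (the paths and the empty sample table are simplified placeholders, and mode handling is omitted):

```
import json

import pyarrow as pa
import pyarrow.parquet as pq

# Read the BigQuery schema.json and build the matching pyarrow schema.
with open("schema.json") as f:
    bq_fields = json.load(f)

arrow_schema = pa.schema(
    [pa.field(f["name"], BQ_TO_PYARROW[f["type"]]) for f in bq_fields]
)

# Write a (here empty) table with that schema; the file is then copied to GCS.
table = pa.Table.from_pydict(
    {f["name"]: [] for f in bq_fields}, schema=arrow_schema
)
pq.write_table(table, "data.parquet")
```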
My data is written to GCS using my pyarrow schema, but when the data transfer runs, I get this error:
```
Provided Schema does not match Table $table_uri. Field $field has changed type from BIGNUMERIC to NUMERIC; JobID: $job_id
```
For field $field of the destination table, my schema.json has the value: `{"mode": "NULLABLE", "type": "BIGDECIMAL", "name": "$field"}`
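Ideally I'd write the entry with BIGNUMERIC instead, something like this (assuming the load schema accepts that type name):

```
{"mode": "NULLABLE", "type": "BIGNUMERIC", "name": "$field"}
```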
Is there a way to define the type as BIGNUMERIC to avoid this error?