I have a collection of Parquet files that I need to upload to a BigQuery table using the Data Transfer API. When I type my data for storage, I use the following PyArrow field types in sync with the BigQuery types:
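The field names and decimal precision/scale below are placeholders rather than my exact schema; the point is just that each PyArrow type is picked to match the intended BigQuery column type.
```
import pyarrow as pa
import pyarrow.parquet as pq
from decimal import Decimal

# Placeholder schema: each PyArrow type lines up with the intended
# BigQuery column type.
schema = pa.schema([
    ("id", pa.int64()),                            # BigQuery INT64
    ("name", pa.string()),                         # BigQuery STRING
    ("created_at", pa.timestamp("us", tz="UTC")),  # BigQuery TIMESTAMP
    ("amount", pa.decimal256(76, 38)),             # BigQuery BIGNUMERIC
])

# A tiny table just to show the types round-tripping through Parquet.
table = pa.table(
    {
        "id": [1],
        "name": ["example"],
        "created_at": [None],
        "amount": [Decimal("12.5")],
    },
    schema=schema,
)
pq.write_table(table, "example.parquet")
```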
Hi @Dulius,
Welcome back to Google Cloud Community.
You may try defining the affected field as BIGNUMERIC in your schema. The error message means your schema no longer matches the destination table because that field's type changed from BIGNUMERIC to NUMERIC, so updating the schema to use BIGNUMERIC for the impacted field should resolve it.
With the field declared that way, it will be detected as BIGNUMERIC and match its counterpart in the destination table.
In general, when defining your PyArrow schema, use types that map to the target BigQuery types, just as you did in your original code, so the data is interpreted and loaded correctly.
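As a rough illustration (the field name is a placeholder), the width of the decimal type is generally what drives whether the column is read as NUMERIC or BIGNUMERIC:
```
import pyarrow as pa

# Same placeholder column, two decimal widths:
as_numeric = pa.field("amount", pa.decimal128(38, 9))      # fits within NUMERIC
as_bignumeric = pa.field("amount", pa.decimal256(76, 38))  # too wide for NUMERIC, needs BIGNUMERIC
```
Switching the impacted field to the wider decimal type keeps the Parquet schema aligned with the BIGNUMERIC column in the destination table.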
Here are some references that might help you.
https://cloud.google.com/bigquery/docs/reference/datatransfer/rest
https://cloud.google.com/bigquery/docs/tables
Hi @Aris_O,
Thanks for getting back to me. It turns out that I had to manually specify the decimal target types in my transfer configuration. Once I added the `decimal_target_types` parameter, the transfer inferred the correct dtype. An example of the transfer configuration option is below:
```
from google.cloud import bigquery_datatransfer

transfer_config = bigquery_datatransfer.TransferConfig(
    # Other fields (display_name, data_source_id, destination_dataset_id,
    # the rest of params, etc.) are omitted; only the decimal option is shown.
    params={
        "decimal_target_types": "NUMERIC,BIGNUMERIC",
    },
)
```
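For anyone replicating this end to end, that config object then just gets passed to the Data Transfer client; the project, location, and remaining transfer settings below are placeholders rather than my actual values:
```
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder project and location.
parent = client.common_location_path("my-project", "us")

created = client.create_transfer_config(
    parent=parent,
    transfer_config=transfer_config,  # the config above, plus the usual
                                      # source/destination settings
)
print(created.name)
```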
I'm guessing that if the param isn't specified, it defaults to `NUMERIC`.
Supporting documentation for those who encounter the same issue: