
BigQuery csv file upload error: "Invalid value at 'job.configuration.load.schema.fields[0]' (Map)"

Hi,

I have tables to which I regularly append CSV files, but since yesterday I have been unable to do so because I am getting the error message below.

The file I am uploading is indeed a CSV and has the same schema (I created a similar table from it, and the schema came out the same as the existing table I was trying to append to).

I have never seen this before; does anyone know what this means?

" Invalid value at 'job.configuration.load.schema.fields[0]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. Invalid value at 'job.configuration.load.schema.fields[1]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. Invalid value at 'job.configuration.load.schema.fields[2]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. Invalid value at 'job.configuration.load.schema.fields[3]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. Invalid value at 'job.configuration.load.schema.fields[4]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. Invalid value at 'job.configuration.load.schema.fields[5]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. Invalid value at 'job.configuration.load.schema.fields[6]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. Invalid value at 'job.configuration.load.schema.fields[7]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. Invalid value at 'job.configuration.load.schema.fields[8]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. Invalid value at 'job.configuration.load.schema.fields[9]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. Invalid value at 'job.configuration.load.schema.fields[10]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. Invalid value at 'job.configuration.load.schema.fields[11]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. Invalid value at 'job.configuration.load.schema.fields[12]' (Map), Cannot bind a list to map for field 'dataGovernanceTags'. "

M3l1ssa_0-1718300441262.png

M3l1ssa_1-1718300496054.png

 

Solved
1 ACCEPTED SOLUTION

I found the error.

The column headers of the CSV file I upload contain spaces.

For example, one of the column headers in the CSV is 'Completed Date'.

In the past, BigQuery would load this as the field name 'Completed_Date'. The header in the CSV itself stayed 'Completed Date', and every append went through without any issues.

Something must have changed since June 3 (the last time I uploaded any CSV files): BigQuery now creates the field name without the underscore, which is why I get this error (and it is a frustrating error to receive, since it gives no hint that the cause is something this simple).

The field name is now 'Completed Date'.

See the screenshot below: on the left are the schema and field names the way they used to be created on upload; on the right is how it looks now, where the column headers in the CSV file have to match the field names exactly.

M3l1ssa_1-1718392657053.png
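In case it helps anyone else hitting the same thing, one workaround is to normalize the CSV headers before uploading so they match the field names the existing table already uses. A minimal Python sketch (the file names are placeholders, and it assumes the only difference is spaces vs. underscores):

```python
import csv

# Rewrite the header row so spaces become underscores,
# e.g. 'Completed Date' -> 'Completed_Date', then copy the data rows unchanged.
with open("export.csv", newline="", encoding="utf-8-sig") as src, \
     open("export_fixed.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)
    header = next(reader)
    writer.writerow([col.strip().replace(" ", "_") for col in header])
    writer.writerows(reader)
```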

 


2 REPLIES

The error message "Invalid value at 'job.configuration.load.schema.fields[0]' (Map)" typically indicates a mismatch between the schema of your CSV file and the schema of the BigQuery table you're trying to append to. Here’s a breakdown of the issue and potential solutions:

Schema Mismatch

  • Field Names: Ensure the column names in your CSV file exactly match the field names in your BigQuery table schema. BigQuery is case-sensitive and expects precise alignment.

  • Data Types: Verify that the data types in your CSV match the data types defined in your BigQuery table schema. Even a slight mismatch can trigger this error.

  • Field Order: Check if the order of columns in your CSV file corresponds to the order of fields in your BigQuery table schema.

Map Interpretation

  • The error suggests that BigQuery might be trying to interpret one of your CSV columns as a complex data type (like a map or dictionary) instead of a simple type (like string, integer, or float).

Troubleshooting Steps

Verify Schema Compatibility

  1. Field Names: Ensure that the column names in your CSV file exactly match the field names in your BigQuery table schema. Pay attention to case sensitivity and special characters.

  2. Data Types: Confirm that the data types in your CSV align with the data types specified in your table schema.

  3. Field Order: Ensure that the order of columns in your CSV matches the order of fields in your table schema.
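A quick way to run the three checks above programmatically is to compare the destination table's schema with the CSV header row. A rough sketch, assuming the google-cloud-bigquery Python client and placeholder table/file names:

```python
import csv
from google.cloud import bigquery

client = bigquery.Client()
table = client.get_table("my-project.my_dataset.my_table")   # existing table (placeholder ID)

# Read just the header row of the CSV; utf-8-sig also strips a BOM if present.
with open("export.csv", newline="", encoding="utf-8-sig") as f:
    csv_header = next(csv.reader(f))

print("Table fields:", [(fld.name, fld.field_type) for fld in table.schema])
print("CSV header  :", csv_header)
print("Names match :", [fld.name for fld in table.schema] == csv_header)
```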

Inspect CSV File

  1. Hidden Characters: Open your CSV file in a text editor and look for any hidden characters, extra spaces, or unusual delimiters that might be confusing BigQuery.

  2. First Row: Make sure the first row of your CSV contains the column headers. If there are no headers, ensure the "Header rows to skip" setting is correct.

  3. Complex Data Types: If your CSV has columns with complex data (arrays, structs, etc.), ensure the corresponding fields are correctly defined in your BigQuery table schema.
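For the hidden-character check in particular, printing the raw first line with repr() makes BOMs, tabs, stray quotes, and trailing spaces visible that a spreadsheet view hides. A small sketch (placeholder file name, and the naive comma split is only meant for a quick look, not for quoted fields):

```python
# Read the first line as raw bytes so nothing is silently normalized.
with open("export.csv", "rb") as f:
    first_line = f.readline()

print(repr(first_line))   # b'\xef\xbb\xbf' at the start means the file has a UTF-8 BOM
print(first_line.decode("utf-8-sig").rstrip("\r\n").split(","))
```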

Check for Recent Changes

  1. Table Schema: Verify if there have been recent changes to the BigQuery table schema. If the schema was modified, consider reverting the changes and trying the upload again.

  2. BigQuery Updates: Occasionally, BigQuery updates its internal behavior. Check the BigQuery release notes for any recent updates that might affect your upload process.

Recreate Table (if possible)

  • If the table isn't too large or critical, consider recreating it with the correct schema and retrying the upload.
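For completeness, the API equivalent of recreating the table looks roughly like this with the Python client. The project/dataset/table ID and the field names are purely illustrative, and note that delete_table is destructive and removes the existing data:

```python
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.my_dataset.my_table"   # placeholder

schema = [
    bigquery.SchemaField("Completed_Date", "DATE"),   # illustrative fields only
    bigquery.SchemaField("Amount", "FLOAT"),
    # ... remaining fields
]

client.delete_table(table_id, not_found_ok=True)      # destructive: drops existing rows
client.create_table(bigquery.Table(table_id, schema=schema))
```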

Using the BigQuery UI

In the BigQuery UI for loading data:

  1. Schema: Make sure you've either selected the correct schema or manually defined it to match your CSV file's structure.

  2. File Options: Ensure that "Header rows to skip" is set to '1' if your CSV has a header row. Verify that the "Field delimiter" matches the delimiter used in your CSV (typically a comma).

Example: Uploading CSV with the UI

In the provided screenshot, make sure:

  1. "Auto detect" is NOT selected in the Schema section.

  2. Manually create schema fields:

    • Click "Add Field" to define each schema field.

    • Enter the field name (matching the column header from your CSV file).

    • Select the appropriate data type.

    • Click "Create Table".
