So, I think the Google documentation is confusing me... it says you can have up to 100,000 load JOBS per day, and nowhere do I see a limit on the number of rows. Yet when I try to load a CSV file with more than 100k rows, I only get the first 100k rows in the table, with no warning about the rest. I use this:
> bq load --allow_quoted_newlines=true --replace --source_format=CSV --autodetect x.xxx xxx.csv
Any thoughts? I'm sure I'm missing something 😞
thx J
BigQuery does not impose a limit on the number of rows you can load into a table, but there may be an error in the data you are trying to load. If there's a malformed record past the 100k-row mark, BigQuery may stop the load at that point. The load job's error messages should tell you whether that's the case.
ok - thx, I'll look into that
Actually, I forgot to update here: it turned out SuiteQL will only return up to 100k rows (standard level).
So either loop/chunk at 100k or, better, if possible, filter on created-date or last-modified-date, load each chunk into a temp table, and MERGE it into the main table. This turns out to be much quicker in the end.
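The chunk-and-merge loop described above could be sketched roughly like this. This is a minimal sketch, not a definitive implementation: `fetch_page` stands in for whatever SuiteQL client call you use, `merge_into_main` stands in for the bq load + MERGE step, and the one-day window size, the `id` join key, and the table names in the MERGE statement are all assumptions you'd adapt to your schema:

```python
from datetime import date, timedelta

PAGE_LIMIT = 100_000  # SuiteQL standard-level cap per query result

# Hypothetical MERGE statement you'd run after landing each chunk in a
# temp table; table names, join key, and column list are placeholders.
MERGE_SQL = """
MERGE `project.dataset.main` T
USING `project.dataset.temp` S
ON T.id = S.id
WHEN MATCHED THEN UPDATE SET T.last_modified = S.last_modified
WHEN NOT MATCHED THEN INSERT ROW
"""

def load_incremental(start, end, fetch_page, merge_into_main):
    """Walk the date range one day at a time so each chunk stays under
    the 100k cap, then merge each chunk into the main table.

    fetch_page(frm, to) -> list of row dicts with last-modified in [frm, to)
    merge_into_main(rows) -> lands rows in a temp table and runs the MERGE
    """
    day = start
    total = 0
    while day <= end:
        rows = fetch_page(day, day + timedelta(days=1))
        # If a single day's chunk hits the cap, the result may be
        # truncated; fail loudly rather than silently dropping rows.
        if len(rows) >= PAGE_LIMIT:
            raise RuntimeError(f"chunk for {day} may be truncated at {PAGE_LIMIT} rows")
        merge_into_main(rows)
        total += len(rows)
        day += timedelta(days=1)
    return total
```

The key point is the truncation check: because SuiteQL silently stops at the cap, a chunk that comes back exactly at the limit is suspect, so narrow the window rather than trust it.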