
Can we resolve BigQuery API memory limit issues by getting more slots?

Hi, I am currently extracting my GA4 data from BigQuery using the BigQuery API.
Since the GA4 daily hits on my account now exceed 2 million, I had to slice the extraction down to 10k rows per API request to process the data.

The problem occurs when I use an ORDER BY clause, which tries to sort over 2 million records.

The error message is:


error occured: 400 Resources exceeded during query execution: The query could not be executed in the allotted memory. Peak usage: 100% of limit.
Top memory consumer(s):
  ORDER BY operations: 99%
  other/unattributed: 1%

I am using the ORDER BY clause because, without it, the BigQuery API returns a different 10k rows on each request.

So my questions are:

1. Would purchasing slot commitments (https://cloud.google.com/bigquery/docs/reservations-commitments) resolve the memory issue?

2. Is there any better way to get 10k records in a stable order? I am currently doing:

ORDER BY USER_PSEUDO_ID, EVENT_TIMESTAMP, GA_SESSION_ORDER
LIMIT 100000 OFFSET ...
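For context, here is a minimal sketch of how I build each paged request. The table name and page size are placeholders, not my real values; the point is that every page runs the same global ORDER BY so the slices don't overlap:

```python
# Sketch of the paging scheme described above (placeholder table name).
# Each request asks BigQuery for one deterministic slice of the sorted result,
# which is why the full 2M-row sort runs on every page.

PAGE_SIZE = 10_000  # rows per API request (placeholder)

def build_page_query(page: int, table: str = "my-project.analytics.events") -> str:
    """Return the SQL for one page; the ORDER BY keeps pages non-overlapping."""
    offset = page * PAGE_SIZE
    return (
        f"SELECT * FROM `{table}` "
        f"ORDER BY USER_PSEUDO_ID, EVENT_TIMESTAMP, GA_SESSION_ORDER "
        f"LIMIT {PAGE_SIZE} OFFSET {offset}"
    )

# e.g. the query for the third page (rows 20,000-29,999)
print(build_page_query(2))
```

Each call just produces the SQL string I send through the API, so the sort cost is paid again on every request.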



