Download Quota Exceeded for Drive Files using API and Service Account

Hello everyone. I have a service account integration with the Drive API. I have files and folders that are created programmatically using the API. Each time a file is uploaded to Drive, I save the file's ID, its public link (the web content link) and a download link.

Some of those files are video files. I then embed each video inside a <video></video> tag using the file's download link, reformatted to include my API key, like so:

https://www.googleapis.com/drive/v3/files/{file_id}?alt=media&key={api_key}
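For context, here is roughly how that embed gets built on my end (a minimal sketch in TypeScript; FILE_ID and API_KEY are placeholders for the real values I store):

// Sketch: build the alt=media download URL from the stored file ID and attach it to a <video> element.
// FILE_ID and API_KEY are placeholders, not real values.
const FILE_ID = "YOUR_FILE_ID";
const API_KEY = "YOUR_API_KEY";
const downloadUrl = `https://www.googleapis.com/drive/v3/files/${FILE_ID}?alt=media&key=${API_KEY}`;

const video = document.createElement("video");
video.controls = true;
video.src = downloadUrl;
document.body.appendChild(video);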

This was working fine until today (it may have broken a few days earlier, but today is when I noticed it).

This now throws the following error: 

{
  "error": {
    "code": 403,
    "message": "The download quota for this file has been exceeded.",
    "errors": [
      {
        "message": "The download quota for this file has been exceeded.",
        "domain": "usageLimits",
        "reason": "downloadQuotaExceeded"
      }
    ]
  }
}

However, I can confirm that this error is erroneous. Here's why: the file in question was uploaded moments ago, no one has downloaded it, and the total space consumed by the service account is currently 2.41 GB of the 2 TB shared space for the service account user.
Is this a new thing with Google? Can anyone help overcome this issue? 

#Google Drive, #usageLimits, #downloadQuotaExceeded, #Error403

15 REPLIES

I am having the exact same issue. It started for me yesterday afternoon (February 12, 2024) out of the blue. In the Google Cloud console, the Google Drive API shows the errors, but under "Quotas & System Limits" it shows I am well within the quotas.

Same thing here. This just started happening to me yesterday. I'm using ESRI's Survey123 & Integromat/Make to generate feature reports, then email a download link. Everyone started getting this error yesterday when trying to download.

Same here as well. Started happening yesterday. 

It appears that the error is no longer there for me and things are working properly now.  I didn't change anything on my side.

Hello, me too. I have the same problem; it has never happened to me before, but it happened today. Is there any fix?

Hello, me too, having the same problem. Any fixes yet?

Exact same issue with my code. First noticed yesterday. The Google Drive files at issue for me are under 9 MB, and the Google Drive 'activity' report shows the files are getting extremely little use.

Working fine for me also now. No changes to my code. Huge relief!

Thanks to blasemedia for the 'good news' report.

I can confirm what @blasemedia said. The issue seems to have been resolved... however, I am skeptical about it. If it appeared out of nowhere, it can just as easily come back again. So I am going to present a few solutions I discovered over the last 24 hours while battling with this unforeseen madness. 

1) I discovered a Google Issue Tracker thread where a subset of this issue (a general 403 response) was being discussed. It is not the same issue as the one we have been discussing here: ours is a "downloadQuotaExceeded" situation, whereas the issue tracker thread was about not being able to embed assets hosted on GD via the export URLs. I faced that issue too, but my permanent solution was to resort to the thumbnail URL for image assets, since image quality wasn't my biggest concern. This may not work for everyone, but here is the solution suggested in the issue tracker thread, which I have been using since around November 2023 with no issues:
OLD LINE OF CODE:

<img src="https://drive.google.com/uc?export=view&id=1hMI7v0KpQF1gMJpu9S1GjWqZiL_wRO-U" alt="Page Image">


NEW LINE OF CODE:

<img src="https://drive.google.com/thumbnail?id=1hMI7v0KpQF1gMJpu9S1GjWqZiL_wRO-U&sz=w1000" alt="Page Image">
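If you have many of these old links scattered around, a small helper can do the rewrite for you (a rough sketch, assuming every old link follows the uc?export=view&id=... pattern; the w1000 width is just the size used in the example above):

// Rough sketch: convert an old uc?export=view link into a thumbnail link.
function toThumbnailUrl(ucUrl: string, width = 1000): string {
  const id = new URL(ucUrl).searchParams.get("id");
  if (!id) {
    throw new Error("No id parameter found in the URL");
  }
  return `https://drive.google.com/thumbnail?id=${id}&sz=w${width}`;
}

// Example:
// toThumbnailUrl("https://drive.google.com/uc?export=view&id=1hMI7v0KpQF1gMJpu9S1GjWqZiL_wRO-U")
// => "https://drive.google.com/thumbnail?id=1hMI7v0KpQF1gMJpu9S1GjWqZiL_wRO-U&sz=w1000"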

2) If you scroll down to the very bottom of the issue tracker thread, you'll notice that Google has responded with "Expected Behaviour, Won't Fix" - meaning the embedding of export URLs is a permanent issue, and because our issue falls under the same 403 subset, we should take notice of the reason Google gives for doing this: third-party cookies (Really!!! FFS). So the takeaway for all of us using GD links to embed assets inside our web apps is that, while the downloadQuotaExceeded issue may have been resolved for now, we need a more permanent way of sorting this out for the future.

My suggestions (non-exhaustive, IMHO only): 
1) Only use the thumbnail URL for image assets. Tested up to sz=w4000 (assuming your source image has that high of a resolution), which negates any quality loss. 
2) Transform download URLs for audio and video assets into the following format, and attach the referrerpolicy attribute with "no-referrer" - I cannot be sure whether it has any effect, but it seems to work for me:

https://www.googleapis.com/drive/v3/files/{FILE_ID}?alt=media&key={API_KEY}
3) Write a generic function that will replace the URLs on the fly using the following solution, in the event this SH*T happens again (see the sketch after this list):
https://lienuc.com/

<video crossorigin="anonymous" controls>
</video>
4) Begin moving files to a more reliable CDN (yes, egress costs... I know... but it's better than losing sleep over such unannounced, badly reported errors). I am personally considering this: 
https://blog.cloudflare.com/introducing-r2-object-storage
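As for point 3, here is the kind of generic function I mean (a sketch only, in TypeScript; it assumes the existing embeds still use drive.google.com uc/file-style links, and API_KEY is a placeholder for your own key):

const API_KEY = "YOUR_API_KEY"; // placeholder for your own API key

// Pull the file ID out of a Drive URL (uc?id=... or /file/d/<id>/ style).
function extractDriveId(url: string): string | null {
  const u = new URL(url);
  return u.searchParams.get("id") ?? u.pathname.match(/\/d\/([^/]+)/)?.[1] ?? null;
}

// Walk the page and swap Drive embed URLs for the formats discussed above.
function rewriteDriveEmbeds(root: ParentNode = document): void {
  root.querySelectorAll<HTMLImageElement>("img[src*='drive.google.com']").forEach((img) => {
    const id = extractDriveId(img.src);
    if (!id) return;
    img.referrerPolicy = "no-referrer"; // see the UPDATE below about page reloads
    img.src = `https://drive.google.com/thumbnail?id=${id}&sz=w1000`;
  });
  root.querySelectorAll<HTMLVideoElement>("video[src*='drive.google.com']").forEach((video) => {
    const id = extractDriveId(video.src);
    if (!id) return;
    video.crossOrigin = "anonymous";
    video.src = `https://www.googleapis.com/drive/v3/files/${id}?alt=media&key=${API_KEY}`;
  });
}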
 
Hope this helps... 

Lastly, there was a time when Google was the go-to for developers... or was that a Mandela effect, or am I hallucinating?

Back to the grind folks!

UPDATE: Forgot to mention that when using the thumbnail URL for image assets, you need the referrerpolicy=no-referrer attribute, otherwise on page reload/refresh, the 403 error will rear its ugly head again. 
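For anyone setting this from script rather than markup, the equivalent is (again, just a sketch; FILE_ID is a placeholder):

// Set referrerPolicy before assigning the thumbnail src, so reloads don't trip the 403.
const thumb = document.createElement("img");
thumb.referrerPolicy = "no-referrer";
thumb.src = "https://drive.google.com/thumbnail?id=FILE_ID&sz=w1000"; // FILE_ID is a placeholder
document.body.appendChild(thumb);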

Nice that our code can now once again read files from Google Drive; however, we have to play by Google's rules.

My observation for some time has been that if code tries to read too many files from Google Drive in too short a time period, then Google will give your IP address a timeout for about 10 minutes.

The threshold for “too many” and “too short” seems fairly low.

When you are in a timeout condition, if you enter this type of link:
https://www.googleapis.com/drive/v3/files/{FILE_ID}?alt=media&key={API_KEY}
into a browser, then you will likely see a message from Google saying:

We're sorry...
... but your computer or network may be sending automated queries. To protect our users, we can't process your request right now.

Did you find a way around this 'too many requests' issue? 

No, I did not.

In my case I am building a PWA (GeoJPG) that displays maps offline.  Each map is a georeferenced JPG hosted on Google Drive.  The user can ask to install groups of JPGs which would result in a bunch of automated calls to Google Drive.

I just did a stress test and kept installing groups of JPGs as fast as I could.  Each group has from 2 to about 10 JPGs.  Group 5 almost finished and then Google cut me off. 

In my case, the only way I see to deal with this is user education. So the first time in a session that a user tries to look at or install a JPG hosted on Google Drive, they will see a message informing them about the 'timeout' issue and tips for avoiding it. That message only displays once per session. Then if a timeout does happen they will see an informative message.

Edit: 30 minutes and counting.  Google Drive is still mad at me.

This is Madness (Persian to Leonidas)... really. 

I wish someone from Google would get in on this and explain why, despite using API keys or access tokens in headers, this issue still happens. The whole purpose of having an API is that programmatic queries/requests will be made. If the user refreshes, and the requests come at 1-2 second intervals, that still doesn't warrant a hard reject... especially when you have 30 minutes and counting, as you said.

Shocking... 

I have this situation happening on my platform, where it's videos coming from GD and the user plays one, wanders about, comes back and plays it again... maybe 2-5 minutes apart, and boom! GD screams "Madness? This is SPARTA!" and kicks me off.

I did other work to let GD cool off. All is now good, about 1 hour 45 minutes after GD threw a fit. I do not know how long I was actually in Google purgatory.

@Google - I am perfectly willing to write code to keep my users from exceeding the threshold you have set for making too many automated calls to GD in a short period of time. By writing that code I help my users and I help you. Win-win. I see no benefit to anyone in Google keeping that threshold a secret.
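For what it's worth, the code on my side would be trivial; the only missing piece is the actual number. Something like this (a sketch only - the 1.5-second gap and per-session cap below are made-up placeholders, not documented limits):

// Sketch: space out Drive downloads client-side.
// MIN_GAP_MS and MAX_PER_SESSION are guesses, not published Google limits.
const MIN_GAP_MS = 1500;
const MAX_PER_SESSION = 50;

let lastFetch = 0;
let fetchCount = 0;

async function throttledDriveFetch(url: string): Promise<Response> {
  if (fetchCount >= MAX_PER_SESSION) {
    throw new Error("Session download limit reached - try again later");
  }
  const wait = Math.max(0, lastFetch + MIN_GAP_MS - Date.now());
  if (wait > 0) {
    await new Promise((resolve) => setTimeout(resolve, wait));
  }
  lastFetch = Date.now();
  fetchCount += 1;
  return fetch(url);
}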