
Service account does not upload even with permission (Cloud Storage + IAM)

Hello! Could someone help me please?

I have created service accounts for external clients to upload files to a bucket.
Each service account is granted access only to its own folder within the bucket.

I created a custom role with a minimal set of permissions:

storage.buckets.get
storage.objects.create
storage.objects.delete
storage.objects.get
storage.objects.list
storage.objects.update
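
For reference, a custom role with exactly these permissions can be created from the gcloud CLI. This is a sketch: ROLE_ID, PROJECT_ID, and the role title are placeholders, not values from this thread.

```shell
# Sketch: create the custom role described above via the gcloud CLI.
# ROLE_ID and PROJECT_ID are placeholders.
gcloud iam roles create ROLE_ID \
  --project=PROJECT_ID \
  --title="Client Upload Role" \
  --permissions=storage.buckets.get,storage.objects.create,storage.objects.delete,storage.objects.get,storage.objects.list,storage.objects.update
```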

The problem is that, at the moment, one of the service accounts does not work, even though it has the same settings as the others.
The client tries to upload the files via the Google Cloud SDK and fails.

I deleted and recreated the service account, and assigned and removed the custom role; nothing worked.

In the last test the client managed to upload only one file; the other failed with the error that normally appears:

ERROR: (gcloud.storage.cp) [service-account@project.iam.gservice.com] does not have permission to access b instance [bucket] (or it may not exist): service-account@project.iam.gservice.com does not have storage.buckets.get access to the Google Cloud Storage bucket. Permission 'storage.buckets.get' denied on resource (or it may not exist). This command is authenticated as service-account@project.iam.gservice.com which is the active account specified by the [core/account] property.

I have attached some images to help.
If anyone can help please do so, I would appreciate it.


ACCEPTED SOLUTION

Hello, @kensan.

After talking with support, the problem was solved. I'll copy their answer here:

"We have identified something when working with large files: when you upload a large file using "gcloud storage cp", the tool switches to parallel composite uploads to optimize the action. You can read more about this feature here [1].

Parallel composite uploads optimize the request by splitting the file into 32 chunks, uploading them as temporary objects into the bucket, and then composing them into a single file in the destination folder. This creates a discrepancy between the action and the permissions you have: in order to upload the temporary objects, the tool needs to list the bucket, but the service account only has permission to write inside the folder.

I have some options you can use that should help you solve this problem.

  • If you think it is convenient, you can grant the service account access at the bucket level; you can see how to apply your custom role to the service account here [2]. That will certainly work, and your account will be able to upload files to the folder.

  • If granting permissions at the bucket level is not an option, you can opt out of parallel composite uploads. Your file will then be uploaded as a single object, and the command won't need to list the bucket to upload temporary objects.

To do that, run the following command: gcloud config set storage/parallel_composite_upload_enabled False. You can see what this command does in the documentation [3]. Once you run it, you will be able to upload larger files."

[1]https://cloud.google.com/storage/docs/parallel-composite-uploads
[2]https://cloud.google.com/storage/docs/access-control/using-iam-permissions#bucket-add
[3]https://cloud.google.com/sdk/gcloud/reference/config/set
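
If changing the global gcloud configuration isn't desirable, gcloud properties can generally also be overridden per invocation through environment variables following the CLOUDSDK_<SECTION>_<PROPERTY> convention. Assuming that convention applies to this property, the setting can be scoped to a single command; the destination path below is a placeholder.

```shell
# Disable parallel composite uploads for this one invocation only,
# using gcloud's CLOUDSDK_<SECTION>_<PROPERTY> environment-variable convention.
# gs://[bucket]/[folder]/ is a placeholder destination.
CLOUDSDK_STORAGE_PARALLEL_COMPOSITE_UPLOAD_ENABLED=False \
  gcloud storage cp ./large-file.bin gs://[bucket]/[folder]/
```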

In my case, I chose not to use parallel composite uploads, so as not to add more permissions to the service account, and it worked. 😀

Anyway, thank you very much for your help and attention. 🤝


3 REPLIES

Hi @rrotter ,

Welcome to Google Cloud Community!

Based on the error you received, your service account still doesn't have the required storage.buckets.get permission. Check whether the IAM policies are applied at the bucket level or inherited from the project level.

To check bucket-level policy:

gcloud storage buckets get-iam-policy gs://[bucket-name] --format=json

To check project-level policy:

gcloud projects get-iam-policy [project-ID] --format=json
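
To narrow the project-level output down to a single service account, the --flatten and --filter flags can be combined; substitute your own project ID and account email below.

```shell
# List only the roles granted to one service account at the project level.
# [project-ID] and the service-account email are placeholders.
gcloud projects get-iam-policy [project-ID] \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:service-account@project.iam.gserviceaccount.com" \
  --format="table(bindings.role)"
```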

Another approach: to allow a service account to upload files only to a specific 'folder' (path prefix) within a bucket, grant it the roles/storage.objectCreator role combined with an IAM Condition that restricts object creation to that prefix.

  1. Go to the Google Cloud Console -> Cloud Storage -> Buckets -> Select your bucket.
  2. Click the "PERMISSIONS" tab.
  3. Click "+ GRANT ACCESS".
  4. Enter the email address(es) of the user(s) or service account(s) you want to grant upload access to.
  5. Assign Role:
    1. Click "Select a role".
    2. Filter by typing "Object Creator".
    3. Select the Storage Object Creator (roles/storage.objectCreator) role.
  6. Add IAM Condition (crucial step):
    1. Click "+ ADD IAM CONDITION".
    2. Give it a descriptive title.
    3. Select "Condition Editor".
    4. Condition: use the resource.name attribute, which represents the full object path:
       resource.name.startsWith("projects/_/buckets/[BUCKET NAME]/objects/[FOLDER NAME]")
  7. Click "SAVE" on the condition.
  8. Save the role binding.

This allows the service account to upload objects whose names start with the folder prefix. Note that this role does not grant storage.objects.list; you can also add roles/storage.objectViewer with the same condition.

Follow the same steps as above, but select the "Object Viewer" role instead.
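
The equivalent conditional binding can also be added from the CLI with gcloud storage buckets add-iam-policy-binding, which accepts a --condition flag. This is a sketch: the bucket, folder, service-account email, and condition title are placeholders.

```shell
# Grant Object Creator restricted to a path prefix; repeat with
# roles/storage.objectViewer if list/read access is also needed.
# [BUCKET NAME], [FOLDER NAME], and the account email are placeholders.
gcloud storage buckets add-iam-policy-binding gs://[BUCKET NAME] \
  --member="serviceAccount:sa@project.iam.gserviceaccount.com" \
  --role="roles/storage.objectCreator" \
  --condition='expression=resource.name.startsWith("projects/_/buckets/[BUCKET NAME]/objects/[FOLDER NAME]"),title=limit-to-folder'
```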

Was this helpful? If so, please accept this answer as “Solution”. If you need additional assistance, reply here within 2 business days and I’ll be happy to help.

Hello, @kensan , how are you?
Thank you very much for your help.

I checked both the bucket-level and project-level policies and didn’t find any account with the roles/storage.objectCreator role, but I need to share some information gathered from tests I ran earlier this morning...

When the service accounts try to upload small files (~70MB), the upload works; however, when they try to upload larger files (~1GB), that’s when the error message mentioned in this thread appears.

Based on that, I decided to test uploading the 1GB file using my account, which has several permissions—unlike the service accounts, which are supposed to have minimal privileges—and it worked!

Now the question is: what can I do to allow these service accounts to do the same as my account, but with the least privilege possible? And which role should be assigned?

Once again, thank you for your attention.
