
Give permission to list the files in a folder of a bucket

I want to grant a user access to a folder in a bucket, so that a user with user-id `A` can access `<bucket>/<folder for storing>/<user-id>/`.

The requirements:

  1. Upload files to the folder in the bucket.
  2. Re-upload a file to the folder in the bucket.
  3. View files in the bucket.
  4. (Added in edit) The user must not see files in other folders.

 

Through the IAM Conditions documentation, I can achieve requirements 1 and 2:

  • Role: Storage Object Admin (predefined role)
  • Condition: `resource.name.startsWith('projects/_/buckets/<bucket-name>/objects/<folder for storing>/<user-id>/')`
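
For reference, this is the kind of binding I mean; a minimal sketch using a recent gcloud that has the gcloud storage surface, where my-bucket, uploads, and user-a@example.com are placeholder names:

gcloud storage buckets add-iam-policy-binding gs://my-bucket \
  --member="user:user-a@example.com" \
  --role="roles/storage.objectAdmin" \
  --condition='expression=resource.name.startsWith("projects/_/buckets/my-bucket/objects/uploads/user-a/"),title=user-a-folder-only'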
The problem is that the user cannot list the files. The Cloud Console shows the following error (screenshot: peterbekti_0-1650258394683.png):

 Additional permissions required to list objects in this bucket. Ask a bucket owner to grant you 'storage.objects.list' permission.

However, the user can upload files, and re-upload a file if they know its exact name (screenshot: peterbekti_1-1650258671918.png).

A point to note:

  1. I am sure the role given above already has permission to list objects, because when I remove the condition, the files can be viewed.

I have tried looking through the documentation, but I haven't found the answer.

1 ACCEPTED SOLUTION

As mentioned in the documentation:

  • "Since the storage.objects.list permission is granted at the bucket level, you cannot use the resource.name condition attribute to restrict object listing access to a subset of objects in the bucket. Users without storage.objects.list permission at the bucket level can experience degraded functionality for the Console and gsutil."

Following the least-privilege principle, I'd recommend assigning roles/storage.objectViewer at the bucket level to your user accounts.

To test the result, I'd recommend running `gsutil ls gs://<bucket_name>/<folder for storing>/<user-id>/` as the user you want to have access to the bucket [1]. You should then be able to list the folder's contents.
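
For example, assuming a service account is used and your caller is allowed to impersonate it (gsutil's top-level -i flag, see [1]); the bucket and account names below are placeholders:

# Grant objectViewer on the whole bucket (no condition), then list while impersonating.
gsutil iam ch serviceAccount:user-a@my-project.iam.gserviceaccount.com:objectViewer gs://my-bucket
gsutil -i user-a@my-project.iam.gserviceaccount.com ls gs://my-bucket/uploads/user-a/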

 

You should also be able to list the folder's contents in the Cloud Console UI.

1- https://stackoverflow.com/questions/56658640/is-there-a-way-to-use-gsutil-while-impersonating-a-serv...


4 REPLIES


Thanks a lot for the reply.

I have tried your suggestion, and yes, it works: user A can view the files in the folder, along with the other folders in the same bucket.

But I hadn't mentioned the additional requirement that user A must not see files in the other folders of the same bucket.

So I think it cannot be done through the Google Cloud UI, and it seems we will develop a workaround another way.

 

Again, thanks a lot for answering.

I have the same requirement: I need to allow a service account to have roles/storage.objectViewer on a specific path in a Google Cloud Storage bucket. However, it seems roles/storage.objectViewer only works at the bucket level when used with a conditional role binding.

When I specified the storage.objectViewer role with the condition
"resource.name.startsWith('projects/_/buckets/${bucket-name}/objects/${path-name}')", I got the error: "${service-account} does not have storage.objects.list access to the Google Cloud Storage bucket".


I used the following steps to address the folder-access requirement.
1. Add a custom IAM role that has only the storage.objects.list permission.
Here is sample terragrunt code:

include {
  path = find_in_parent_folders()
}

terraform {
  # Private in-house module that creates a project-level custom IAM role.
  source = "git@bitbucket.org:xxxxx//gcp/iam/custom_role"
}

inputs = {
  role_id     = "customStorageObjectsList"
  title       = "Custom Storage Objects List"
  description = "Custom Storage Objects List Role"
  # Listing only; object read access comes from the conditional objectViewer binding in step 2.
  permissions = [
    "storage.objects.list"
  ]
}
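
Without terragrunt, the same custom role can be sketched directly with gcloud (my-project is a placeholder):

gcloud iam roles create customStorageObjectsList \
  --project=my-project \
  --title="Custom Storage Objects List" \
  --description="Custom Storage Objects List Role" \
  --permissions=storage.objects.list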

2. Grant the customStorageObjectsList role and the storage.objectViewer role to an IAM member, such as a service account.

Here is sample terragrunt code:

include {
  path = find_in_parent_folders()
}

terraform {
  source = "git@bitbucket.org:xxxxx//gcp/iam/member"
}

inputs = {
  conditional_roles = [
    {
      role        = "roles/customStorageObjectsList"
      title       = "xxx service-account storage.objects-list permission"
      description = "xxx service-account storage.objects-list permission"
      # The condition matches the bucket resource itself, so listing works for the whole bucket.
      expression  = "resource.name.startsWith('projects/_/buckets/${bucket_name}')"
    },
    {
      role        = "roles/storage.objectViewer"
      title       = "xxx service-account storage.objectViewer permission"
      description = "xxx service-account storage.objectViewer permission"
      # Join with " || " so the condition matches either folder; joining with ""
      # would concatenate the startsWith() calls into an invalid CEL expression.
      expression = join(" || ", [
        "resource.name.startsWith('projects/_/buckets/${bucket_name}/objects/folder-a')",
        "resource.name.startsWith('projects/_/buckets/${bucket_name}/objects/folder-b')"
      ])
    }
  ]
  member = "serviceAccount:${service-account.email_address}"
}
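
For comparison, the conditional objectViewer binding above can be sketched with gcloud as well (my-project, my-bucket, and the service-account address are placeholders; the custom list role would be bound the same way using projects/my-project/roles/customStorageObjectsList):

gcloud storage buckets add-iam-policy-binding gs://my-bucket \
  --member="serviceAccount:sa@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.objectViewer" \
  --condition='expression=resource.name.startsWith("projects/_/buckets/my-bucket/objects/folder-a") || resource.name.startsWith("projects/_/buckets/my-bucket/objects/folder-b"),title=folder-a-b-only'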
 
Thus, the service account has the storage.objects.list permission at the bucket level, and the storage.objects.get permission only at the folder level (folder-a and folder-b).
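
To check the behavior, listing and reading can be tested while impersonating the service account (a sketch; the account, folder, and file names are placeholders):

# Listing works bucket-wide, so object names in other folders remain visible:
gsutil -i sa@my-project.iam.gserviceaccount.com ls gs://my-bucket/folder-c/
# Reading object contents succeeds only under folder-a/ and folder-b/:
gsutil -i sa@my-project.iam.gserviceaccount.com cat gs://my-bucket/folder-a/report.txt   # OK
gsutil -i sa@my-project.iam.gserviceaccount.com cat gs://my-bucket/folder-c/secret.txt   # AccessDeniedException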