gcloud sql export returns 403: service account with Storage Admin & Storage Object Admin not enough?

I'm using a service account that has the Storage Admin and Storage Object Admin roles project-wide.

But when I try to do a gcloud sql export to a bucket in that project, it returns a 403 error.
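
For reference, the command is along these lines (the instance, database, and bucket names here are placeholders):

```
gcloud sql export csv my-instance gs://my-export-bucket/export.csv \
    --database=my-db \
    --query="SELECT * FROM my_table"
```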

 

```
(gcloud.sql.export.csv) HTTPError 403: The service account does not have the required permissions for the bucket.
```

 

What further permissions would it need to write the export there?


7 REPLIES

The error message indicates that the service account you are using does not have the required permissions to write to the bucket you are exporting to. Even though the service account has the roles/storage.admin and roles/storage.objectAdmin roles, these might not be sufficient for exporting data from Cloud SQL to Cloud Storage.

To export data from Cloud SQL to Cloud Storage, the service account should have the roles/storage.legacyBucketWriter role on the bucket. This role allows writing and listing objects in the bucket, but not reading their contents.

To grant the roles/storage.legacyBucketWriter role to the service account:

  1. Go to the Cloud Storage browser.
  2. Click the bucket that you want to export the data to.
  3. Click the Permissions tab.
  4. Click Add.
  5. In the New members field, enter the email address of the service account.
  6. In the Role drop-down list, select "Storage > Legacy Bucket Writer".
  7. Click Save.
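
Equivalently, from the command line (the bucket name and service-account email below are placeholders):

```
gsutil iam ch \
    "serviceAccount:my-sa@my-project.iam.gserviceaccount.com:roles/storage.legacyBucketWriter" \
    gs://my-export-bucket
```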

After granting the role, you should be able to export data from Cloud SQL to the bucket without errors.

Note: The roles/storage.legacyBucketWriter role is a legacy role and might not be recommended for all use cases. For more granular control, consider using roles like roles/storage.objectCreator or roles/storage.objectAdmin.

Additional troubleshooting tips:

  • Ensure the service account has the necessary permissions for Cloud SQL operations.
  • While it's beneficial for the Cloud Storage bucket to be in the same project as the Cloud SQL instance, it's not a strict requirement.
  • Check if the bucket has a retention policy or is locked, which might impact object management (a quick check is shown below).
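
For the retention-policy bullet, a quick way to check from the CLI (the bucket name is a placeholder):

```
# Shows the bucket's retention policy, if any, and whether it is locked.
gsutil retention get gs://my-export-bucket
```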

Thanks for the suggestion!

We've now added:

Storage Admin
Storage Legacy Bucket Owner
Storage Legacy Bucket Reader
Storage Legacy Bucket Writer
Storage Legacy Object Owner
Storage Legacy Object Reader
Storage Object Admin

And still get the `HTTPError 403: The service account does not have the required permissions for the bucket.` error 

We could try storage.objectCreator; does objectAdmin not already cover that though?

Do you think it's just the service account I'm using that needs the permissions, or does the Cloud SQL Postgres instance's own service account also need write permissions on the bucket for the gcloud sql export to work?

ACCEPTED SOLUTION

The roles/storage.objectAdmin role indeed covers the permissions of roles/storage.objectCreator, so adding objectCreator would be redundant in this case.

Given the extensive permissions you've already granted, it's surprising that you're still encountering a 403 error. This suggests that there might be another factor at play.

Here are a few things to consider:

  1. Cloud SQL Service Account: The Cloud SQL instance uses its own service account to interact with other GCP services. When you perform an export, it's this service account that needs permission to write to the Cloud Storage bucket, not necessarily the service account you use to run the gcloud command. Ensure that the Cloud SQL instance's service account has the necessary permissions on the bucket (a command sketch follows this list).

  2. Bucket Policy: Check whether the bucket has uniform bucket-level access (UBLA) enabled. If UBLA is enabled, you cannot assign ACLs to individual objects, so the service account must be granted its permissions at the bucket level (a quick check is sketched after this list).

  3. Bucket Location: Ensure that there are no restrictions on the bucket related to locations. For instance, if the bucket has a location constraint, ensure that the Cloud SQL instance is in a region that's allowed to write to that bucket.

  4. Organization Policies: It's possible that there are organization-wide policies in place that restrict certain actions, even if the IAM permissions seem correct. Check with your GCP organization admin to see if there are any policies that might be affecting this operation.

  5. Service Account Activation: Ensure that the service account you're using with gcloud is correctly activated. You can verify this with `gcloud auth list` and confirm the service account is the currently active account.

  6. Service Account Key: If you're using a service account key to authenticate, ensure that the key hasn't been revoked or expired.

  7. Bucket's IAM Policy Binding: Sometimes, even if you add a role to a service account, it might not take effect immediately. Recheck the bucket's IAM policy bindings to ensure the service account is correctly listed with the roles you've assigned.
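
For point 2, a quick way to check whether UBLA is enabled (the bucket name is a placeholder):

```
# Reports whether uniform bucket-level access is enabled on the bucket.
gsutil ubla get gs://my-export-bucket
```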

Given all the permissions you've added, I'd strongly suspect the issue lies with the Cloud SQL instance's service account. Ensure that this service account has the necessary permissions on the bucket. If the Cloud SQL service account is the issue, granting it the roles/storage.objectAdmin role on the specific bucket should suffice, as sketched below.
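
Here is a minimal sketch of the commands, assuming a Cloud SQL instance named my-instance and a bucket named my-export-bucket (both placeholders):

```
# Look up the Cloud SQL instance's own service account.
SA=$(gcloud sql instances describe my-instance \
    --format="value(serviceAccountEmailAddress)")

# Grant it object read/write access on the export bucket only.
gsutil iam ch "serviceAccount:${SA}:roles/storage.objectAdmin" gs://my-export-bucket

# Confirm the binding is present.
gsutil iam get gs://my-export-bucket
```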

Thank you! Adding the roles/storage.objectAdmin role to the Cloud SQL service account did the trick!

Not solved.

I tried the steps mentioned above but am still getting the same error, even with all the permissions added. Please help.

 

```
AccessDeniedException: 403 (PII Removed by Staff) does not have storage.objects.list access to the Google Cloud Storage bucket. Permission 'storage.objects.list' denied on resource (or it may not exist).
```

The error message you're encountering indicates a lack of storage.objects.list access, which is a different issue from the original post, where the problem was with storage.objects.create (write) permissions.

Here are the steps to resolve your issue:

  1. Grant List Permission: The service account needs the roles/storage.objectViewer role on the bucket, which includes the storage.objects.list permission. This role allows the service account to list objects in the bucket (a CLI equivalent is shown after these steps).

  2. Assign the Role to the Cloud SQL Service Account:

    • Go to the Cloud Storage section in the Google Cloud Console.
    • Click on the bucket where you want to export the data.
    • Click on the "Permissions" tab.
    • Click on the "Add members" button.
    • In the "New members" field, enter the email address of the Cloud SQL service account.
    • In the "Role" dropdown, select Storage Object Viewer.
    • Click "Save".
  3. Verify Permissions:

    • After adding the role, verify that the Cloud SQL service account has the correct permissions. You can view the permissions of the service account in the IAM & Admin section of the Google Cloud Console.
  4. Retry the Export:

    • Once you have updated the permissions, try the export operation again.
  5. Check for Other Restrictions:

    • Ensure there are no other bucket-level or organization-level policies that might be restricting access.
  6. Review Error Messages:

    • If you encounter further errors, carefully review the error messages for specific details about the permissions or resources involved.
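
Equivalently, from the CLI, assuming SA_EMAIL is the Cloud SQL instance's service account and my-export-bucket is your bucket (both placeholders):

```
# Grant list/read access on the bucket; requires a reasonably recent gcloud.
# (gsutil iam ch "serviceAccount:SA_EMAIL:roles/storage.objectViewer" gs://my-export-bucket also works.)
gcloud storage buckets add-iam-policy-binding gs://my-export-bucket \
    --member="serviceAccount:SA_EMAIL" \
    --role="roles/storage.objectViewer"
```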