Hello, I'm trying to import a SQL script into a Postgres Cloud SQL database as part of a deployment process, but I'm running into permissions issues when trying to grant the Cloud SQL service account access to the bucket in which the import file is stored. I'm using the following shell commands:
SA_NAME=$(gcloud sql instances describe $DB_INSTANCE --project=$PROJECT_ID --format="value(serviceAccountEmailAddress)")
echo "Attempting to grant read access to Cloud SQL service account: "
echo $SA_NAME
gsutil acl ch -u ${SA_NAME}:R gs://$GCS_DEPLOYMENT_BUCKET
gsutil acl ch -u ${SA_NAME}:R gs://$FULL_GCS_PATH_TO_SCRIPT
However, when this is executed, I get the following error:
Failed to set acl for gs://mybucket/. Please ensure you have OWNER-role access to this resource.
Failed to set acl for gs://mybucket/artefacts/db/db_setup.sql. Please ensure you have OWNER-role access to this resource.
I have the Owner role for the project in question. If I import the SQL file via the GCP console, it correctly assigns the roles for the Cloud SQL service account and the import operation succeeds, but I'd like to be able to script this as above. Am I missing something in my script, or is there a better way to achieve this in an automated manner?
You may have the Owner role on the project, but does the identity that the script runs as?
The user (service account) that the script uses to set the ACLs (gsutil acl ch ...) apparently doesn't have OWNER access to the bucket, and from a security perspective it probably shouldn't.
I'm also not sure why you'd go through bucket ACLs at all. You can simply grant read access on the bucket to the Cloud SQL service account through an IAM policy binding, and that's all it needs to read the import file.
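For example, something along these lines (just a sketch, reusing the $SA_NAME, $GCS_DEPLOYMENT_BUCKET, $FULL_GCS_PATH_TO_SCRIPT, $DB_INSTANCE, and $PROJECT_ID variables from your script; $DB_NAME is a hypothetical variable for the target database, and the identity running these commands must be allowed to set IAM policy on the bucket):
# Grant the Cloud SQL service account read access to the bucket via an IAM binding
# (objectViewer is read-only and is enough to fetch the import file).
gsutil iam ch serviceAccount:${SA_NAME}:objectViewer gs://${GCS_DEPLOYMENT_BUCKET}
# The import step itself can then be scripted too, instead of using the console:
gcloud sql import sql ${DB_INSTANCE} gs://${FULL_GCS_PATH_TO_SCRIPT} \
    --database=${DB_NAME} --project=${PROJECT_ID} --quiet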