So I'm integrating some services with my organization's Google Voice. I'm using a service account to authenticate as myself and access the Google Vault API to create an export of the data that I need. The export is created successfully, and my service account has the permissions needed to download objects (storage.objects.get and storage.objects.getIamPolicy), however I don't have permission to access the bucket that the export points me to.
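For context, the credentials object used in create_export below is built from my service account's JSON key file, roughly like this (a sketch; the path and scopes are placeholders, and note that no subject is specified at this point):

from google.oauth2 import service_account

SCOPES = ['https://www.googleapis.com/auth/ediscovery', 'https://www.googleapis.com/auth/devstorage.read_only']
SERVICE_ACCOUNT_FILE = 'path/to/service_account_file.json'  # placeholder path to the key file

# Load the service account key and attach the OAuth scopes.
credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES
)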
import googleapiclient.discovery
from datetime import datetime
from googleapiclient.errors import HttpError

def create_export(start_time):
    google_vault = googleapiclient.discovery.build('vault', 'v1', credentials=credentials)
    body = {
        "exportOptions": {
            "voiceOptions": {  # Options for Voice exports.
                "exportFormat": "MBOX",
            },
        },
        "name": f"googleVoiceExport-{datetime.now()}",
        "query": {
            'corpus': 'VOICE',
            'dataScope': 'ALL_DATA',
            'accountInfo': {
                'emails': [
                    'censored@email.com'
                ]
            },
            'timeZone': 'xxxxxx',
            'method': 'ACCOUNT',
            'voiceOptions': {
                'coveredData': ['TEXT_MESSAGES', 'VOICEMAILS', 'CALL_LOGS']
            },
            'startTime': start_time
        }
    }
    with google_vault:
        try:
            export = (
                google_vault
                .matters()
                .exports()
                .create(matterId=google_voice_matter_id, body=body)
                .execute()
            )
            return {"status": "success", "data": export}
        except HttpError as err:
            return {"status": "error", "data": f"{err}"}
And for downloading, I pass a CloudStorageFile object:
import logging
from google.cloud import storage

def get_export_zip(file_info):
    logging.info(f"Attempting to download file {file_info['objectName']} from bucket {file_info['bucketName']}")
    storage_client = storage.Client(project='project-id')
    bucket = storage_client.bucket(file_info["bucketName"])
    blob = bucket.blob(file_info["objectName"])
    contents = blob.download_as_bytes()
    return contents
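Called roughly like this (a sketch using the illustrative wait_for_export_files helper above; the local filename is arbitrary):

files = wait_for_export_files(google_voice_matter_id, export['id'])
for file_info in files:
    contents = get_export_zip(file_info)
    # Save each exported object under its object name's last path segment.
    with open(file_info['objectName'].split('/')[-1], 'wb') as f:
        f.write(contents)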
For my authentication I point to the .json key file for my service account, which has all the necessary permissions and is also domain-delegated with the scopes it should need. It has the 'Storage Object Viewer' role, and the domain-wide delegation also grants it access to the following scopes:
https://www.googleapis.com/auth/ediscovery
https://www.googleapis.com/auth/ediscovery.readonly
https://www.googleapis.com/auth/devstorage.read_only
https://www.googleapis.com/auth/devstorage.full_control
https://www.googleapis.com/auth/cloud-platform
When trying to download the file I get:
403 GET https://storage.googleapis.com/download/storage/v1/b/<bucketname>/o/<file>?alt=media: <service-account> does not have storage.objects.get access to the Google Cloud Storage object. Permission 'storage.objects.get' denied on resource (or it may not exist).: ('Request failed with status code', 403, 'Expected one of', <HTTPStatus.OK: 200>, <HTTPStatus.PARTIAL_CONTENT: 206>)
It's important to note that I didn't create the bucket; I'm just using the information passed to me by the Google Vault API.
So I solved my issue by specifying the subject that I wanted to impersonate using domain-wide delegation for my service account, along with the OAuth scopes that I needed, like so:
from google.oauth2 import service_account

SCOPES = ['https://www.googleapis.com/auth/ediscovery', 'https://www.googleapis.com/auth/devstorage.read_only']
SERVICE_ACCOUNT_FILE = 'path/to/service_account_file.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES, subject='example@gmail.com'
)

def get_export_zip(file_info):
    logging.info(f"Attempting to download file {file_info['objectName']} from bucket {file_info['bucketName']}")
    # 👇 this is new!
    storage_client = storage.Client(project='project_id', credentials=credentials)
    bucket = storage_client.bucket(file_info["bucketName"])
    blob = bucket.blob(file_info["objectName"])
    contents = blob.download_as_bytes()
    return contents
And it works now.
Hi @Nesmb16,
Welcome to Google Cloud Community!
Please check if the IAM policy for the bucket is properly set by entering the following command:
gcloud storage buckets get-iam-policy gs://[bucket-name]
Based on the error that you provided, you are missing the storage.objects.get permission. The bucket owner should grant at least the Storage Object Viewer (roles/storage.objectViewer) role.
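If it is easier to check from Python than with gcloud, here is a rough equivalent using the storage client (a sketch; it assumes application default credentials with a default project and permission to read the bucket's IAM policy):

from google.cloud import storage

def print_bucket_iam_policy(bucket_name):
    # Roughly equivalent to `gcloud storage buckets get-iam-policy`: list the role bindings on the bucket.
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    policy = bucket.get_iam_policy(requested_policy_version=3)
    for binding in policy.bindings:
        print(binding['role'], sorted(binding['members']))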
Please check the relevant documentation for your reference.
Hope this helps.
Thank you for your response! My service account has the Storage Object Viewer role organization-wide, for all resources. However, the issue is that I didn't create the Google Cloud Storage bucket, nor do I have access to it. I created an export using the Google Vault API and the export files are being stored in that bucket, so how do I access them? I should be able to, given the docs. Any help with this?