
Google Vault Exports and Google Cloud Storage API Issues

So I'm integrating some services with my organization's Google Voice. I'm using a service account to authenticate as myself to access the Google Vault API and create an export of the data I need. The export is created successfully, and my service account has the permissions set up to download objects (storage.objects.get and storage.objects.getIamPolicy), but I don't have permission to access the bucket the export points me to.

 

from datetime import datetime

import googleapiclient.discovery
from googleapiclient.errors import HttpError


def create_export(start_time):
    google_vault = googleapiclient.discovery.build('vault', 'v1', credentials=credentials)

    body = {
        "exportOptions": {
            "voiceOptions": {  # Options for Voice exports.
                "exportFormat": "MBOX",
            },
        },
        "name": f"googleVoiceExport-{datetime.now()}",
        "query": {
            'corpus': 'VOICE',
            'dataScope': 'ALL_DATA',
            'accountInfo': {
                'emails': [
                    'censored@email.com'
                ]
            },
            'timeZone': 'xxxxxx',
            'method': 'ACCOUNT',
            'voiceOptions': {
                'coveredData': ['TEXT_MESSAGES', 'VOICEMAILS', 'CALL_LOGS']
            },
            'startTime': start_time
        }
    }

    with google_vault:
        try:
            export = (
                google_vault
                .matters()
                .exports()
                .create(matterId=google_voice_matter_id, body=body)
                .execute()
            )
            return {"status": "success", "data": export}

        except HttpError as err:
            return {"status": "error", "data": f"{err}"}

 

And for downloading, I pass a CloudStorageFile object:

import logging

from google.cloud import storage


def get_export_zip(file_info):
    logging.info(f"Attempting to download file {file_info['objectName']} from bucket {file_info['bucketName']}")
    storage_client = storage.Client(project='project-id')
    bucket = storage_client.bucket(file_info["bucketName"])
    blob = bucket.blob(file_info["objectName"])
    contents = blob.download_as_bytes()

    return contents
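For context on where `file_info` comes from: once an export finishes, the response from the Vault API's `matters().exports().get()` includes a `cloudStorageSink.files` list, where each entry is a CloudStorageFile carrying `bucketName` and `objectName`. A minimal sketch of pulling those pairs out of the export resource (the polling loop, client setup, and the sample values below are illustrative assumptions, not from the original post):

```python
def extract_export_files(export):
    """Return {bucketName, objectName} dicts from a completed Vault export.

    `export` is the dict returned by matters().exports().get(); once its
    'status' is 'COMPLETED', 'cloudStorageSink' lists the objects Vault
    wrote to its Google-managed bucket.
    """
    sink = export.get("cloudStorageSink", {})
    return [
        {"bucketName": f["bucketName"], "objectName": f["objectName"]}
        for f in sink.get("files", [])
    ]


# Illustrative shape of a completed export resource:
export = {
    "status": "COMPLETED",
    "cloudStorageSink": {
        "files": [
            {"bucketName": "some-vault-bucket", "objectName": "exports/voice.zip"}
        ]
    },
}
print(extract_export_files(export))
# → [{'bucketName': 'some-vault-bucket', 'objectName': 'exports/voice.zip'}]
```

Each dict returned by this helper can be passed straight to `get_export_zip` as `file_info`.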

For my authentication I point to the .json file for my service account, which has all the necessary permissions and is also domain-delegated with the scopes it should need. It has the 'Storage Object Viewer' role as well as:

  • storage.buckets.createTagBinding
  • storage.buckets.get
  • storage.buckets.getIamPolicy
  • storage.buckets.getObjectInsights
  • storage.buckets.list
  • storage.buckets.update 

and the domain-wide delegation also grants it access to the scopes:

https://www.googleapis.com/auth/ediscovery

https://www.googleapis.com/auth/ediscovery.readonly

https://www.googleapis.com/auth/devstorage.read_only

https://www.googleapis.com/auth/devstorage.full_control

https://www.googleapis.com/auth/cloud-platform

When trying to download the file, I get:

403 GET https://storage.googleapis.com/download/storage/v1/b/<bucketname>/o/<file>?alt=media: <service-account> does not have storage.objects.get access to the Google Cloud Storage object. Permission 'storage.objects.get' denied on resource (or it may not exist).: ('Request failed with status code', 403, 'Expected one of', <HTTPStatus.OK: 200>, <HTTPStatus.PARTIAL_CONTENT: 206>)

It's important to note that I didn't create the bucket; I'm just using the information passed to me by the Google Vault API.

ACCEPTED SOLUTION

So I solved my issue by specifying the subject I wanted to impersonate via domain-wide delegation for my service account, along with the OAuth scopes I needed, like so:

 

from google.oauth2 import service_account

SCOPES = [
    'https://www.googleapis.com/auth/ediscovery',
    'https://www.googleapis.com/auth/devstorage.read_only',
]

SERVICE_ACCOUNT_FILE = 'path/to/service_account_file.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES, subject='example@gmail.com'
)

 

def get_export_zip(file_info):
    logging.info(f"Attempting to download file {file_info['objectName']} from bucket {file_info['bucketName']}")

    #                                                   👇 this is new!
    storage_client = storage.Client(project='project_id', credentials=credentials)
    bucket = storage_client.bucket(file_info["bucketName"])
    blob = bucket.blob(file_info["objectName"])
    contents = blob.download_as_bytes()

    return contents

 

 

And it works now.
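In case it helps anyone doing the same thing: the bytes returned by `get_export_zip` are the zip archive itself, so they can be written straight to disk. A small sketch (the destination path here is just an illustration):

```python
def save_export(contents, dest_path):
    # `contents` is the bytes object returned by get_export_zip();
    # Vault delivers the export as a zip archive, so write it as-is.
    with open(dest_path, "wb") as fh:
        fh.write(contents)
    return dest_path
```

Usage would look like `save_export(get_export_zip(file_info), "voice_export.zip")`.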

