
Authentication Between Cloud Function and Other GCP Services

We are developing several Cloud Functions for our work, and these functions need to save files to GCS, return responses built from BigQuery data, insert data into BigQuery tables, and so on.

This means the Cloud Functions should be able to access the GCS and BQ services.

At the moment, we are adding the service account key JSON to the Cloud Function source, as you can see below:

[screenshot: the service account key JSON file added to the Cloud Function source]

And accessing the JSON in the Cloud Function's main.py:

[screenshot: main.py loading the service account key JSON]
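In essence, main.py does something like the following (the key filename here is just illustrative):

    from google.oauth2 import service_account
    from google.cloud import storage

    # Load the key file bundled with the function source
    credentials = service_account.Credentials.from_service_account_file(
        "service-account-key.json"
    )
    client = storage.Client(
        credentials=credentials, project=credentials.project_id
    )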

The problem with this approach is that I have to do it in every Cloud Function, and if the JSON changes, I need to update it in every function.

I tried to find a way of using "GOOGLE_APPLICATION_CREDENTIALS", but I was not successful, as I could not find how and where I should set it up. Can I use this so that I set the credentials in one place and all the functions can use them?
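From what I can tell, the variable should point at the key file path, something like the following (the path here is just illustrative), but I could not figure out where a single, shared place to set it would be:

    import os
    from google.cloud import bigquery

    # Application Default Credentials read this environment variable
    # (the path is illustrative)
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/key.json"

    client = bigquery.Client()  # authenticates with that key file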

I even thought of keeping this JSON in a GCS bucket and referencing it from every function, but I was unsuccessful there too.

Any thoughts on this?

Thanks

1 ACCEPTED SOLUTION

Hi @vbgcp ,

I would suggest looking into this thread, which should address your issue. Let me know if anything needs clarification. Thank you.

Regards,
Marc


Thanks @marcanthonyb, the link you provided gave me the hints I needed.

What I did was create a service account and grant it the roles it needed. For example, the roles below were assigned to the service account.

roles/bigquery.dataEditor
roles/storage.admin

I then used this service account as the runtime service account of the Cloud Function.

[screenshot: the Cloud Function's runtime service account setting]

By doing this, I was able to connect to GCS and BQ without the JSON key by simply putting the following in the function code:

    import google.auth
    from google.cloud import bigquery, storage

    # ADC resolves to the function's runtime service account
    credentials, project_id = google.auth.default()
    storage_client = storage.Client(credentials=credentials, project=project_id)
    # The BigQuery client picks up the same credentials implicitly
    bigquery_client = bigquery.Client()
 
Using this mechanism, I did not have to use the JSON key at all and was able to write to GCS and insert into a BQ dataset table.
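For anyone hitting the same issue, here is the kind of call that then works against the storage_client and bigquery_client above (bucket, dataset, and table names are placeholders):

    # Write a file to GCS
    bucket = storage_client.bucket("my-bucket")
    bucket.blob("output/result.txt").upload_from_string("hello world")

    # Stream a row into a BigQuery table
    errors = bigquery_client.insert_rows_json(
        "my-project.my_dataset.my_table",
        [{"name": "example", "value": 1}],
    )
    if errors:
        print("BigQuery insert errors:", errors)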
 
Thanks