We are developing several Cloud Functions for our work, and these functions need to save files to GCS, render responses from BigQuery data, insert data into BigQuery tables, and so on.
This means the cloud functions should be able to access the GCS and BQ services.
At this point, we are adding the service account key JSON file to each Cloud Function's deployment and accessing it in the function's main.py:
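Along these lines (a minimal sketch; the key-file name sa-key.json is just a placeholder for whatever the bundled file is called):

```python
# main.py: load the bundled service account key and build clients with it.
# "sa-key.json" is a placeholder name; the key file is deployed alongside
# this file in the function source.
from google.cloud import bigquery, storage
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file("sa-key.json")

bq_client = bigquery.Client(credentials=credentials, project=credentials.project_id)
gcs_client = storage.Client(credentials=credentials, project=credentials.project_id)
```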
The problem with the above approach is that I have to do this in every cloud function, and if the JSON changes, I need to change it in every cloud function.
I tried to find a way of using "GOOGLE_APPLICATION_CREDENTIALS", but I was not successful, as I could not find how and where I should set it up. Can I use this so that I set the credentials in one place and all the functions can use them?
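For context, this is the mechanism I was trying to use: the client libraries read GOOGLE_APPLICATION_CREDENTIALS as the path to a key file (a minimal sketch; the path here is hypothetical):

```python
# Point GOOGLE_APPLICATION_CREDENTIALS at a key file so the client
# libraries pick it up automatically. The path below is hypothetical.
import os
from google.cloud import bigquery

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/sa-key.json"
client = bigquery.Client()  # finds the key via the environment variable
```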
I even thought of keeping this JSON in a GCS bucket and referencing it from every function, but I was unsuccessful there too.
Any thoughts on this?
Thanks
Thanks @marcanthonyb, the link you provided gave me the hints I needed.
What I did is create a service account and grant it the roles it needed. For example, the roles below were assigned to the service account:
roles/bigquery.dataEditor
roles/storage.admin
I then used this service account as the runtime service account of the cloud function.
By doing this, I was able to connect to GCS and BQ without the JSON key, simply by using the following in the function code:
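Something along these lines (a minimal sketch; with the runtime service account set, the client libraries pick up its credentials automatically via Application Default Credentials, so no key file is referenced):

```python
# main.py: no key file needed. The clients authenticate as the
# function's runtime service account via Application Default Credentials.
from google.cloud import bigquery, storage

bq_client = bigquery.Client()
gcs_client = storage.Client()
```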