In GCP we have a VM with a service account attached to it, and we want to use that service account to access resources across multiple projects. Is there any limit in Google Cloud on cross-project access? I couldn't find any such limit in the GCP documentation.
Our requirement is to access BigQuery resources in 1000+ GCP projects.
Hello ik009,
There was a similar concern posted on Stack Overflow, and as per John Hanley, who commented on that post: "Google removed the 600 refresh token limit document so I don't know about that point. However, you are misunderstanding what a service account is and how it is used. It is used to issue API calls. This means that the access token would be generated and used for that API call. The API call would be to a service. That service would be in one project. The token would not be authenticating against 1K+ projects, only with one service at that point in time. Access tokens are valid for one hour (changeable) so I am not sure what caching may be done, but the access token can be reused over and over."
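To make that concrete, here is a minimal sketch of what that looks like in practice: one set of Application Default Credentials (the VM's attached service account) issuing separate BigQuery calls against several projects in turn. The project, dataset, and table names below are hypothetical placeholders.

```python
# Minimal sketch: the same attached service account querying BigQuery
# tables that live in different projects, one API call at a time.
from google.cloud import bigquery

# On a GCE VM, the client picks up the attached service account
# automatically via Application Default Credentials.
client = bigquery.Client(project="my-vm-project")  # hypothetical

for project_id in ["data-project-001", "data-project-002"]:  # hypothetical
    # Each query is a single API call to one service in one project;
    # the same cached access token (valid ~1 hour) is reused across calls.
    sql = f"SELECT COUNT(*) AS n FROM `{project_id}.my_dataset.my_table`"
    for row in client.query(sql).result():
        print(project_id, row.n)
```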
Hi Dionv,
Thanks for the explanation, but my question is a little different.
We are trying to design an architecture in GCP where we pull BigQuery data from multiple projects (700-1000) and process that data on Compute Engine.
For that, we plan to create a service account, attach it to our Compute Engine instance, and grant it access to BigQuery in all 700-1000 projects (a sketch of that grant loop is below). But before finalizing the design, we want clarity on the GCP project limit for this.
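If it helps, here is a rough sketch of how such a grant could be scripted with the Cloud Resource Manager v1 API. The service account email and project IDs are hypothetical, and this is just one way to do the read-modify-write on each project's IAM policy (a gcloud loop or Terraform would work equally well).

```python
# Rough sketch: grant one service account BigQuery read access across many
# projects. All names here are hypothetical placeholders.
from googleapiclient import discovery

crm = discovery.build("cloudresourcemanager", "v1")

MEMBER = "serviceAccount:bq-reader@my-vm-project.iam.gserviceaccount.com"
ROLE = "roles/bigquery.dataViewer"
project_ids = ["data-project-001", "data-project-002"]  # ... up to 1000

for pid in project_ids:
    # Read-modify-write of the project's IAM policy; the etag returned by
    # getIamPolicy protects against concurrent edits when written back.
    policy = crm.projects().getIamPolicy(resource=pid, body={}).execute()
    bindings = policy.setdefault("bindings", [])
    binding = next((b for b in bindings if b["role"] == ROLE), None)
    if binding is None:
        binding = {"role": ROLE, "members": []}
        bindings.append(binding)
    if MEMBER not in binding["members"]:
        binding["members"].append(MEMBER)
    crm.projects().setIamPolicy(resource=pid, body={"policy": policy}).execute()
```

Note this only sets per-project IAM bindings; it doesn't answer whether the platform imposes a limit once a single identity holds roles in 1000+ projects, which is exactly the question here.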
Is there any limit on cross-project access in GCP?
Thanks