I want a way to grant access between all datasets in a single GCP project by default. When a new dataset is created, I want access between it and all existing datasets to be granted automatically, without having to update the metadata on every dataset each time a new dataset is created.
There doesn't appear to be a way to do this, and if I'm creating multiple new datasets in a single project I have to retrospectively update all datasets in that project. Has anyone attempted to tackle this issue yet?
Out of curiosity, why do you want datasets to be authorized to each other in your project by default?
In our use case we've seen that there are security implications with this, so we only authorize what's needed. You can create a script or a stored procedure that performs the dataset authorizations the way you want them every time you create a new dataset.
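For example, a minimal sketch of such a script using the google-cloud-bigquery Python client might look like the following. The project and dataset IDs are placeholders, and the entity_id dict mirrors the REST DatasetAccessEntry shape, so double-check it against your client library version:

```python
from google.cloud import bigquery

# Placeholders: adjust to your environment.
PROJECT_ID = "my-project"
NEW_DATASET_ID = "new_dataset"  # the dataset that was just created

client = bigquery.Client(project=PROJECT_ID)


def authorize(source_dataset_id: str, consumer_dataset_id: str) -> None:
    """Let views in consumer_dataset_id read data in source_dataset_id."""
    source = client.get_dataset(f"{PROJECT_ID}.{source_dataset_id}")
    entry = bigquery.AccessEntry(
        role=None,
        entity_type="dataset",
        # Mirrors the REST DatasetAccessEntry representation.
        entity_id={
            "dataset": {"projectId": PROJECT_ID, "datasetId": consumer_dataset_id},
            "targetTypes": ["VIEWS"],
        },
    )
    entries = list(source.access_entries)
    if entry not in entries:  # keep the script idempotent on re-runs
        entries.append(entry)
        source.access_entries = entries
        client.update_dataset(source, ["access_entries"])


# Authorize the new dataset against every existing dataset, in both directions.
for existing in client.list_datasets(project=PROJECT_ID):
    if existing.dataset_id == NEW_DATASET_ID:
        continue
    authorize(existing.dataset_id, NEW_DATASET_ID)
    authorize(NEW_DATASET_ID, existing.dataset_id)
```

You could run something like this from whatever pipeline creates the datasets, or trigger it on dataset creation (for example from an audit-log-driven Cloud Function), so the authorizations are applied without touching each dataset by hand.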
What do you mean by "... grant access between all datasets in a single GCP project by default"? I am imagining that you have a user or group and you want those users/groups to be able to access the tables contained in all the datasets in the project. I am imagining that you don't want to create a dataset and THEN explicitly authorize those users/groups on that dataset ... is this the correct thinking? If yes, then you would grant the BigQuery roles to the users/groups at the GCP project level rather than on each individual dataset. IAM roles set at the project level are inherited when access is requested on any dataset within the project.
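For example, here is a rough sketch of granting a BigQuery role at the project level with the Resource Manager Python client; the project ID, role, and group are placeholders:

```python
from google.cloud import resourcemanager_v3
from google.iam.v1 import iam_policy_pb2

client = resourcemanager_v3.ProjectsClient()
resource = "projects/my-project"  # placeholder project

# Read-modify-write of the project-level IAM policy.
policy = client.get_iam_policy(request={"resource": resource})

# A binding added at the project level is inherited when access is
# requested on any dataset within the project.
policy.bindings.add(
    role="roles/bigquery.dataViewer",
    members=["group:data-readers@example.com"],  # placeholder principal
)

client.set_iam_policy(
    request=iam_policy_pb2.SetIamPolicyRequest(resource=resource, policy=policy)
)
```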
The use case I have is that, when I create multiple datasets belonging to a single GCP project, I have to set up "Authorise Dataset" access so that a view in one dataset can query a view or table in a different dataset. For this particular scenario I'm not interested in people, groups, or service accounts having permissions to query views or tables; I handle those permissions at the table level.
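If it's useful, the existing authorisations on a dataset can be listed with the Python client roughly like this (the dataset name is a placeholder), which shows what would need retrofitting today:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder: the dataset whose data is being shared with views elsewhere.
source = client.get_dataset("my-project.raw_data")

# Dataset- and view-type access entries are the "Authorise Dataset" /
# "Authorise View" grants; the other entries are user/group/role bindings.
for entry in source.access_entries:
    if entry.entity_type in ("dataset", "view"):
        print(entry.entity_type, entry.entity_id)
```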