Looker Studio Pro in a large-scale production environment

Is anyone using LSP at large scale (multiple departments, many users, etc.)? If yes, could you please share how you implemented the key items below? Any best-practice document would also be helpful. We are trying to use LSP as our enterprise BI tool and appreciate your input.

1) CI/CD pipeline

2) Cache management

3) Dashboard level permission

4) Data source - Connecting to BQ service accounts

& more


@Pop_venkat 

We have created quite a few similar projects for 50-100 users.

However, please be aware that what you are asking for is consulting work worth at least 5k-10k USD, especially at the enterprise level.
Please do not expect someone to give away a document outlining best practices for free on a forum.



Thanks for the response. I am looking for best practices from Looker/Google. We are looking to onboard more than 5k users from multiple departments. If not all of the items, could you share high-level input on managing CI/CD, i.e. Dev > QA > Prod? This is directly available in other BI tools like Power BI, Qlik, Tableau, etc.

@Pop_venkat 
In this case you should contact your GCP reseller or the GCP division in your country, as the scope is well beyond Looker Studio community knowledge.

We are using it in a similar situation.

1) CI/CD pipeline > LSP is not yet mature enough to be integrated with CI/CD, so changes in production happen through the publish feature. Yes, not ideal. We use Dataform + BQ, integrated into our CI/CD, to prepare the datamarts feeding LSP (see the sketch below).
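As a rough illustration, one of those Dataform models could look like the following; the file name, schema, and table reference are hypothetical:

```
-- definitions/orders_datamart.sqlx (hypothetical Dataform model)
config {
  type: "table",
  schema: "datamarts",
  description: "Datamart consumed by Looker Studio Pro"
}

SELECT
  order_id,
  order_date,
  SUM(revenue) AS revenue
FROM ${ref("stg_orders")}
GROUP BY order_id, order_date
```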

2) Cache management > We have a parameter in the data source called "cache_hash", which is referenced in a custom query in the BigQuery connector. LSP is embedded in an iframe in our internal app; every time the underlying data changes, this hash is updated and injected as a query-string parameter, which forces LSP to refresh the data (sketch below). Otherwise, with BQ you can set the cache to as low as 1 min, which is enough most of the time.
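A minimal sketch of such a custom query, assuming a hypothetical datamart table; @cache_hash is the data source parameter described above. It never filters rows out, but a new value changes the query LSP issues, which invalidates the cached result:

```
-- Hypothetical custom query in the BigQuery connector.
-- @cache_hash does not affect the result set; changing its value
-- simply forces Looker Studio to re-run the query instead of
-- serving the cached result.
SELECT
  order_id,
  order_date,
  revenue
FROM `my_project.my_dataset.orders_datamart`
WHERE @cache_hash = @cache_hash  -- always true; only ties the cache to the hash
```

If the parameter is allowed to be modified in report URLs, the embed URL can carry it as URL-encoded JSON, e.g. ?params=%7B%22ds0.cache_hash%22%3A%22<new-hash>%22%7D, where ds0 is the data source alias.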

3) Dashboard-level permission > For access control to the data, we have an attribute in the source table with the list of authorised users for each row, which we compare against the @DS_USER_EMAIL parameter in a custom query in the BigQuery connector (example below). Access to the dashboard itself is managed with Google SSO (users are on Workspace) and Google Groups, but this isn't strictly necessary: if they don't have access to the data, they see nothing in the dashboards anyway.
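A minimal sketch of that row-level filter, assuming a hypothetical table with an authorised_emails ARRAY<STRING> column; @DS_USER_EMAIL is the viewer's email as passed by Looker Studio:

```
-- Only return rows whose authorised user list contains the viewer.
-- Table and column names are hypothetical.
SELECT
  t.* EXCEPT (authorised_emails)
FROM `my_project.my_dataset.sales_datamart` AS t
WHERE @DS_USER_EMAIL IN UNNEST(t.authorised_emails)
```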

4) Data source - Connecting to BQ service accounts > natively supported; just follow the docs.