
Long run times when running a Continuous Integration suite

I am currently exploring the new continuous integration feature in Looker. I have created a suite with the following settings:

  • Validators = SQL, Assert, Content, LookML
  • Ignore hidden fields = yes
  • Query concurrency = 10 (default)
  • Incremental validation = yes

When I run it on my dev branch, it takes 23-26 minutes to run. Surprisingly, this is no faster than when I run it for production.

I had hoped that I could introduce the following workflow:

  1. developer makes changes to the code
  2. developer commits code
  3. developer runs CI suite to identify any errors (e.g. SQL errors)
  4. Are there errors?
    • no = deploy to production
    • yes = fix errors (go back to step 1)
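The gated workflow above can be sketched as a small script. This is only an illustration of the decision flow, not real Looker tooling: the helper names (`run_ci_suite`, `deploy_to_production`) are hypothetical placeholders, since CI suites are triggered from the Looker UI.

```python
# Hypothetical sketch of the commit -> validate -> deploy gate described above.
# run_ci_suite and deploy_to_production are placeholders, not real Looker SDK calls.

def run_ci_suite(branch: str) -> list[str]:
    """Placeholder: run the CI suite against the dev branch, return any errors."""
    return []  # pretend the suite passed


def deploy_to_production(branch: str) -> None:
    """Placeholder: deploy the validated branch to production."""
    print(f"deploying {branch} to production")


def commit_and_gate(branch: str) -> bool:
    """Step 3-4 of the workflow: deploy only if the CI suite finds no errors."""
    errors = run_ci_suite(branch)
    if errors:
        for err in errors:
            print(f"fix before deploying: {err}")
        return False  # back to step 1: fix errors, commit again
    deploy_to_production(branch)
    return True
```

The practical problem in this thread is that step 3 takes ~25 minutes, which makes the per-commit gate painful.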

However, I don't think this is feasible for every single commit if the CI suite is going to take nearly half an hour to run. Instead, we will probably run it, say, once a week as a separate debugging exercise.

Any tips on how to speed this up? Is anyone else experiencing long run times? I am going to do an audit of unused Explores / fields, but I didn't think we had a particularly large instance.

1 ACCEPTED SOLUTION

Hey @ruthc - great that you've started exploring that new functionality! Curious to see how it works and what pitfalls it has. We're currently on the enterprise Looker update schedule, so we have a couple of months to prepare 🙂

To your question "Any tips on how to speed this up?" - I would split the validators by when they need to run. Do you really need the Assert and Content validations on every commit?

We run the Assert validation just before production deployment, since it's rarely an issue. And we don't run the Content validation at all, because in our case we don't care about users' content.
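The split being suggested could be made concrete as a stage-to-validator mapping. Note this is purely illustrative: Looker CI suites are configured in the UI, and the stage names below are made up for the example.

```python
# Hypothetical mapping of validators to pipeline stages, following the split
# suggested above. Looker CI suites are configured in the Looker UI; this dict
# only makes the scheme concrete.
SUITES = {
    "per_commit": ["SQL", "LookML"],  # fast feedback on every commit
    "pre_deploy": ["Assert"],         # data tests just before production
    "weekly_audit": ["Content"],      # optional, if user content matters to you
}


def validators_for(stage: str) -> list[str]:
    """Return the validators configured for a given (hypothetical) stage."""
    return SUITES.get(stage, [])
```

The idea is simply that the slow, rarely-failing validators move out of the per-commit path.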

Hope that makes sense. Looking forward to other ideas!


Thanks for the suggestions. Splitting the tests into multiple suites is a good idea. We do care about the Content Validator, as we use LookML dashboards, but I have set it to ignore personal folders (I forgot to mention that).

I removed the Assert Validator from the suite and it ran in 11 minutes. I might also create individual suites per model (since a developer will only be working on one model at a time when making code changes).

My general experience so far is that the SQL Validator adds a lot of value (since we already have the ability to perform the other tests). For example, we realised that 1) our Looker service account needed to be given access to a data source, 2) a table had disappeared from our SQL database, and 3) there were a number of smaller issues: typos and errors in LookML field definitions, plus upstream schema changes such as renamed database columns. These would only have been picked up in the front end, and only if we or a user had queried those specific fields and hit an error.