Update (write-back) data on BigQuery from Looks?

Is there a way to let users update data in the database right from Looks?
I read about actions here, but I found the documentation very abstract.
Can someone please explain the process for this?

Thanks,
Krish.

1 ACCEPTED SOLUTION

Hey Krishna,

I’ve been able to write data from Looker to BigQuery using both Data Actions and the Looker Action Hub. In either case, you’ll need to push data from Looker to some middleware that interprets the webhook from Looker and performs the operations needed to stream the data to BigQuery.

Luckily, Google has a great service called Google Cloud Functions that makes this really easy. Like AWS Lambda, Cloud Functions lets you deploy code that is executed in response to some event. With a data action, you can push JSON containing data from Looker, along with user-defined form parameters, to a Cloud Function endpoint. The Cloud Function then parses the JSON, extracts the relevant values, and calls the BigQuery client library to stream the results to BigQuery.
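To make that concrete, here’s a sketch of the webhook body a data action POSTs, shown as a Python dict. The keys match the LookML param and form_param defined below; the exact envelope may include other top-level keys depending on your Looker version, so treat this as illustrative rather than authoritative:

# Illustrative shape of the data action webhook body (not exhaustive).
# "data" carries the action's param values; "form_params" carries the
# values the user submitted in the form.
sample_payload = {
    "data": {"name": "Ada"},
    "form_params": {"annotation": "Yes"},
}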

Here’s a quick overview of how to use Cloud Functions to stream data from Looker to BigQuery. In this example, we’ll create a data action and a Cloud Function that let an end user persist an annotation to BigQuery:

Create the Data Action

In this example, we’re going to attach a data action to a field and allow end users to mark whether or not a name is a cool name.

dimension: name {
  type: string
  sql: ${TABLE}.name ;;
  action: {
    label: "Cool Name?"
    url: ""
    param: {
      name: "name"
      value: "{{ value }}"
    }
    form_param: {
      name: "annotation"
      type: select
      label: "Cool name?"
      default: "No"
      description: "Do you think that this name is a cool name?"
      option: {
        name: "No"
      }
      option: {
        name: "Yes"
      }
    }
  }
}

Note: We’re going to leave the url blank for now. Once we’ve spun up the Cloud Function, we’ll paste its endpoint in.

Configure the Cloud Function

  1. Follow the first three steps here to select your GCP project and enable API access for Cloud Functions and BigQuery.
  2. Navigate to https://console.cloud.google.com/functions and ensure that you’ve selected the same project in which BigQuery resides.
  3. Click Create Function, and give it a name. Select a memory allocation (in my experience, you can select the minimum for this type of operation).
  4. Select HTTP as your trigger.
  5. Select your preferred runtime (for this example, I will use Python 3.7, but versions of Node.js are also supported). A minimal skeleton of what an HTTP-triggered Python function looks like is shown after this list.
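Concretely, choosing the HTTP trigger on the Python runtime means Cloud Functions invokes your function with a Flask request object, and whatever you return becomes the HTTP response. A minimal sketch, fleshed out into the full function in the next section:

# Minimal HTTP-triggered Cloud Function skeleton (Python runtime).
# Cloud Functions passes a flask.Request object; the string you return
# becomes the HTTP response body.
def annotation(request):
    payload = request.get_json()  # the data action's JSON body
    return '{"looker": {"success": true}}'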

Write the Cloud Function

We’re now going to write a simple Python function that writes the user-selected annotation to BigQuery, and place it in main.py:

from google.cloud import bigquery
import datetime
import time

def annotation(request):
    r = request.get_json()  # fetch the data action JSON payload

    client = bigquery.Client()
    dataset_id = ''  # replace with the name of the BQ dataset
    table_id = ''    # replace with your table ID
    table_ref = client.dataset(dataset_id).table(table_id)
    table = client.get_table(table_ref)  # API request to fetch the table schema

    # request variables
    name = r['data']['name']
    annotation_value = r['form_params']['annotation']

    # system variables
    sys_time = int(time.time())

    rows_to_insert = [
        (
            name,
            annotation_value,
            datetime.datetime.fromtimestamp(sys_time).strftime('%Y-%m-%d %H:%M:%S')
        )
    ]
    errors = client.insert_rows(table, rows_to_insert)  # API request; returns [] on success
    if errors:
        return '{"looker": {"success": false}}'  # surface the failure to Looker
    return '{"looker": {"success": true, "refresh_query": true}}'  # success response to Looker

Additional things to configure:

  • The ‘Function to Execute’ is annotation
  • Make sure to include a line for google-cloud-bigquery==1.5.0 in requirements.txt
  • Click the ‘Trigger’ tab and copy the URL. Paste this into the action that you set up in the first step. (A quick way to test the endpoint follows below.)
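Before pointing the LookML action at the URL, you can sanity-check the deployed endpoint by POSTing a payload shaped like the data action webhook. A sketch using the third-party requests package; the URL is a placeholder for your own trigger URL:

import requests  # third-party package: pip install requests

# Placeholder: substitute the trigger URL copied from your Cloud Function.
url = "https://REGION-PROJECT.cloudfunctions.net/annotation"

resp = requests.post(
    url,
    json={"data": {"name": "Ada"}, "form_params": {"annotation": "Yes"}},
)
print(resp.status_code, resp.text)  # expect 200 and the Looker success JSON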

The End Result: (demo screenshot from the original post not reproduced here)

Caveats:

  • Right now, this function is open to the internet, so anyone with your Cloud Function URL can write data to your BigQuery instance. Consider adding checks to validate that the request is actually coming from Looker; one lightweight option is sketched after this list.
  • For a more comprehensive and secure approach, consider adapting this using the Action Hub framework. I can provide more detail on this if you’re interested.
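On the first caveat: one lightweight (and admittedly imperfect) option is to send a shared secret as an extra param on the action and reject requests that lack it. This is a sketch; the token param and its value are illustrative assumptions, not something Looker provides out of the box:

# In the LookML action you'd add a param carrying a shared secret, e.g.
#   param: { name: "token" value: "SOME_LONG_RANDOM_STRING" }
# Param values arrive under "data" in the webhook body, so the function can
# check the secret before touching BigQuery.
EXPECTED_TOKEN = "SOME_LONG_RANDOM_STRING"  # illustrative value

def annotation(request):
    r = request.get_json(silent=True) or {}
    if r.get("data", {}).get("token") != EXPECTED_TOKEN:
        return ('{"looker": {"success": false}}', 403)  # reject unknown callers
    # ... proceed with the BigQuery insert shown above ...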

Cheers!
Jesse

