Cloud function error Cannot read properties of undefined (reading 'jobCompletedEvent')

Hi there, 

I got an error when my Cloud Function executed:

run_scheduled_query_ga4_daily_master TypeError: Cannot read properties of undefined (reading 'jobCompletedEvent')
    at exports.runScheduledQuery

Here's the Cloud Function I'm using (from Simo's blog: https://www.teamsimmer.com/2022/12/07/how-do-i-trigger-a-scheduled-query-when-the-ga4-daily-export-h...):

const bigqueryDataTransfer = require('@google-cloud/bigquery-data-transfer');

exports.runScheduledQuery = async (event, context) => {
  // Update configuration options
  const projectId = 'my_project_id';
  const configId = 'id from scheduled query';
  const region = 'eu';

  // Load the log data from the buffer
  const eventData = JSON.parse(Buffer.from(event.data, 'base64').toString());
  const destinationTableId = eventData.protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.tableId;

  // Grab the table date and turn it into the run time for the scheduled query
  const tableTime = destinationTableId.replace('events_', '');
  const year = tableTime.substring(0, 4),
        month = tableTime.substring(4, 6),
        day = tableTime.substring(6, 8);
  // Set the run time for the day after the table date so that the scheduled query works with "yesterday's" data
  const runTime = new Date(Date.UTC(year, month - 1, parseInt(day) + 1, 12));
  // Create a proto-buffer Timestamp object from this
  const requestedRunTime = bigqueryDataTransfer.protos.google.protobuf.Timestamp.fromObject({
    seconds: runTime / 1000,
    nanos: (runTime % 1000) * 1e6
  });

  const client = new bigqueryDataTransfer.v1.DataTransferServiceClient();
  const parent = client.projectLocationTransferConfigPath(projectId, region, configId);

  const request = {
    parent,
    requestedRunTime
  };

  const response = await client.startManualTransferRuns(request);
  return response;
};

This is the log query I'm using in my log sink:

resource.type="bigquery_dataset"
resource.labels.dataset_id="analytics_123456789"
resource.labels.project_id="my_project_name"
protoPayload.metadata.tableCreation.reason="JOB"
protoPayload.serviceName="bigquery.googleapis.com"
protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob"
protoPayload.authenticationInfo.principalEmail="firebase-measurement@system.gserviceaccount.com" 
protoPayload.resourceName:"tables/events_"
NOT
protoPayload.resourceName:"tables/events_intraday"

Simo's blog uses this log filter:

protoPayload.methodName="jobservice.jobcompleted" 
protoPayload.authenticationInfo.principalEmail="firebase-measurement@system.gserviceaccount.com" 
protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.datasetId="my_project_id.dataset_id" 
protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.tableId=~"^events_\d+"

I'm not an expert in this area, so I asked Duet AI for help, and it suggested adding:

const protoPayload = {
  methodName: "google.cloud.bigquery.v2.JobService.InsertJob",
  jobCompletedEvent: {
    job: {
      jobConfiguration: {
        load: {
          destinationTable: {
            datasetId: "my_project_id.dataset_id",
            tableId: "~^events_\d+"
          }
        }
      }
    }
  }
};

but I'm not sure where to use it in the Cloud Function, or how.

Solved
1 ACCEPTED SOLUTION

The error you're encountering, TypeError: Cannot read properties of undefined (reading 'jobCompletedEvent'), arises because eventData.protoPayload.serviceData is undefined at the moment the Cloud Function tries to read jobCompletedEvent from it. This points to a difference in the log entry structure between your setup and Simo's setup.
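
In fact, the two filters match two different audit-log formats: Simo's methodName="jobservice.jobcompleted" selects the legacy AuditData entries, while your methodName="google.cloud.bigquery.v2.JobService.InsertJob" together with protoPayload.metadata.tableCreation selects the newer BigQueryAuditMetadata entries. Roughly (a sketch of the relevant fields, not verbatim log entries):

// Legacy AuditData entry (what Simo's filter matches, and what the function reads):
// protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load
//     .destinationTable.tableId  ->  "events_20230101"

// Newer BigQueryAuditMetadata entry (what your filter matches):
// protoPayload.metadata.tableCreation.reason  ->  "JOB"
// protoPayload.resourceName  ->  "projects/.../datasets/analytics_123456789/tables/events_20230101"
// There is no protoPayload.serviceData on these entries, hence the TypeError.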

You could follow Duet AI's suggestion by declaring the protoPayload object at the start of your Cloud Function and then changing the line:

const destinationTableId = eventData.protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.tableId;

to:

const destinationTableId = protoPayload.jobCompletedEvent.job.jobConfiguration.load.destinationTable.tableId;
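
Note, though, that this hardcodes the values, so the run time would no longer track the actual export date. If you keep your InsertJob-based filter, a more direct fix is to derive the table ID from protoPayload.resourceName, which the newer format does populate. A minimal sketch, assuming resourceName looks like "projects/.../datasets/analytics_123456789/tables/events_20230101":

// Replace the serviceData-based lookup with the resourceName of the created table:
const resourceName = eventData.protoPayload.resourceName;
const destinationTableId = resourceName.split('/').pop(); // e.g. "events_20230101"

The rest of the function (stripping the events_ prefix and computing the run time) can stay as it is.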

In addition to the above, you may want to double-check that your log query actually matches the logs generated by your BigQuery load jobs. The log query you are currently using is:

resource.type="bigquery_dataset"
resource.labels.dataset_id="analytics_123456789"
resource.labels.project_id="my_project_name"
protoPayload.methodName="JOB"
protoPayload.serviceName="bigquery.googleapis.com"
protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob"
protoPayload.authenticationInfo.principalEmail="firebase-measurement@system.gserviceaccount.com" 
protoPayload.resourceName:"tables/events_"
NOT
protoPayload.resourceName:"tables/events_intraday"

This log query will match any log entry that has all of the following properties:

  • The resource type is bigquery_dataset.
  • The dataset ID is analytics_123456789.
  • The project ID is my_project_name.
  • The table creation reason is JOB.
  • The service name is bigquery.googleapis.com.
  • The method name is google.cloud.bigquery.v2.JobService.InsertJob.
  • The authentication information principal email is firebase-measurement@system.gserviceaccount.com.
  • The resource name contains tables/events_ but not tables/events_intraday.

If your BigQuery export does not generate logs matching all of these properties, the sink will never trigger your Cloud Function at all. If it does trigger but the matched entry uses the newer metadata format, eventData.protoPayload.serviceData will be undefined, which is exactly the error you are seeing.
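
For instance, if you would rather keep the function code unchanged, the sink filter would need to select the legacy entries again, along the lines of Simo's original (with the dataset reference adjusted to your own; treat this as a sketch to verify against your actual logs):

protoPayload.methodName="jobservice.jobcompleted"
protoPayload.authenticationInfo.principalEmail="firebase-measurement@system.gserviceaccount.com"
protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.datasetId="analytics_123456789"
protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.tableId=~"^events_\d+"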

To troubleshoot this, you can use the Logs Explorer in Cloud Logging to inspect the actual audit log entries produced by the GA4 export job. Once you can see a real entry, you can determine the correct fields to reference in both your log filter and your Cloud Function.
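
If you'd rather debug from inside the function, you can log the decoded payload and read the table ID defensively so it works with either format. A sketch (optional chaining requires a Node.js 14+ runtime):

// Inside exports.runScheduledQuery, right after decoding the Pub/Sub event:
const eventData = JSON.parse(Buffer.from(event.data, 'base64').toString());
console.log(JSON.stringify(eventData.protoPayload)); // inspect the real structure in the logs

// Try the legacy AuditData path first, then fall back to the newer format's resourceName:
const destinationTableId =
  eventData.protoPayload?.serviceData?.jobCompletedEvent?.job?.jobConfiguration
    ?.load?.destinationTable?.tableId
  ?? eventData.protoPayload?.resourceName?.split('/').pop();
if (!destinationTableId) {
  throw new Error('Unexpected log entry shape: ' + JSON.stringify(eventData.protoPayload));
}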
