
Creating a Sink for BigQuery Audit Logs Across Organizations or Projects

Hello!

I'm currently setting up a sink for BigQuery audit logs, and I'm wondering whether a single sink can collect audit logs from an entire organization or from multiple projects into a single BigQuery table. Is there a way to achieve this, or do I need to create a separate sink for each project? I appreciate your assistance with my query!

 

Solved
2 ACCEPTED SOLUTIONS

Hi @ylee1212 ,

Aggregated Log Sink Definition: An aggregated log sink collects logs from multiple resources, such as all the projects under a folder or an entire organization. You create it at the organization or folder level, with the --include-children flag if using gcloud, so that logs from all child resources are included.
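
If you prefer the CLI, here is a minimal sketch of creating such a sink with gcloud; SINK_NAME, ORG_ID, PROJECT_ID, and DATASET_ID are placeholders for your own values:

# Create an organization-level aggregated sink that routes BigQuery
# audit logs from every child project into one BigQuery dataset.
gcloud logging sinks create SINK_NAME \
  bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID \
  --organization=ORG_ID \
  --include-children \
  --log-filter='resource.type="bigquery_resource" AND protoPayload.serviceName="bigquery.googleapis.com"'

For a folder-scoped sink, replace --organization=ORG_ID with --folder=FOLDER_ID. Note that the destination is a dataset, not a single table: Logging creates and populates tables inside that dataset based on the log names.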

Console Steps: To create an aggregated log sink to collect BigQuery audit logs from multiple projects or an entire organization in the Google Cloud Console, follow these steps:

  1. Go to the Google Cloud console: https://console.cloud.google.com/.

  2. Click the hamburger menu (three horizontal lines) in the top left corner of the page.

  3. Select Logging, then Log Router.

  4. In the resource picker at the top of the page, select the organization or folder you want to collect logs from; an aggregated sink must be created at that level.

  5. Click the Create sink button.

  6. In the Create sink dialog box, enter the following information:

    • Sink name: Enter a name for your sink.
    • Destination: Select BigQuery dataset, then choose the project and dataset the logs should be written to. Logging creates and populates tables inside that dataset automatically, so you do not select an individual table.
    • Include children: Enable the option to include logs from child resources (the console equivalent of the --include-children flag) so the sink covers every project under the organization or folder.
  7. Under Filter, select Create a new filter.

  8. In the Filter expression field, enter the following filter expression to collect all BigQuery audit logs:

resource.type="bigquery_resource" AND protoPayload.serviceName="bigquery.googleapis.com"

  9. Click the Create sink button.
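
Before relying on the sink, it is worth confirming that the filter matches entries at the level where you created it. A quick sketch with gcloud logging read, where ORG_ID is a placeholder:

# Read a few recent entries at the organization level to sanity-check the filter.
gcloud logging read \
  'resource.type="bigquery_resource" AND protoPayload.serviceName="bigquery.googleapis.com"' \
  --organization=ORG_ID --limit=5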

Service Account Permissions: After you have created the sink, you will need to grant the sink's writer identity (a Google-managed service account) permission to write to the destination BigQuery dataset. To do this, follow these steps:

  1. Copy the sink's writer identity from the sink's details page in Log Router (or retrieve it with gcloud, as shown after this list).
  2. Go to the IAM & Admin > IAM page in the Google Cloud console.
  3. Click Grant access.
  4. Paste the writer identity into the New principals field.
  5. In the Select a role field, choose BigQuery Data Editor (roles/bigquery.dataEditor); note there is no "BigQuery Data Writer" role.
  6. Click Save.
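
The same grant can be done from the CLI. A minimal sketch, assuming SINK_NAME, ORG_ID, and PROJECT_ID as placeholders (the writerIdentity field already includes the serviceAccount: prefix):

# Look up the service account that Logging writes with for this sink.
WRITER_IDENTITY=$(gcloud logging sinks describe SINK_NAME \
  --organization=ORG_ID --format='value(writerIdentity)')

# Grant it BigQuery Data Editor on the project that holds the dataset.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="$WRITER_IDENTITY" \
  --role=roles/bigquery.dataEditor

If you want tighter scoping, you can instead share only the destination dataset with the writer identity from the BigQuery console.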

Once you have granted the writer identity permission to write to the dataset, the sink will start exporting audit logs. You can then query the tables that Logging creates in that dataset to view them.
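
As a quick example, here is a hedged query sketch using the bq CLI. The exact table names depend on which logs the sink routes, but BigQuery audit logs typically land in tables such as cloudaudit_googleapis_com_data_access; PROJECT_ID and DATASET_ID are placeholders:

# List the ten most recent BigQuery data-access audit entries.
bq query --use_legacy_sql=false '
SELECT timestamp, protopayload_auditlog.methodName
FROM `PROJECT_ID.DATASET_ID.cloudaudit_googleapis_com_data_access`
ORDER BY timestamp DESC
LIMIT 10'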


Hi @ylee1212 ,

Thank you for the additional information. The presence or absence of the protopayload_auditlog.metadataJson field in audit logs should not inherently differ based on whether the logs are collected at the project or organization level. Both levels should capture the same fields for the same types of events.

If you're observing a discrepancy, it might be due to the specific types of logs being captured, the filters applied, or other configuration differences between the two sinks. The metadataJson field provides additional metadata for certain types of audit log entries, particularly data access logs.
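
One way to quantify the discrepancy is to count how many exported rows actually carry metadataJson in each destination. A sketch with the bq CLI, where PROJECT_ID and DATASET_ID are placeholders and cloudaudit_googleapis_com_data_access is the typical, but not guaranteed, table name:

# Count how many exported data-access entries include metadataJson.
bq query --use_legacy_sql=false '
SELECT
  COUNT(*) AS total_rows,
  COUNTIF(protopayload_auditlog.metadataJson IS NOT NULL) AS rows_with_metadata
FROM `PROJECT_ID.DATASET_ID.cloudaudit_googleapis_com_data_access`'

Running this against the dataset behind each sink gives a like-for-like comparison.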

To troubleshoot, ensure both sinks are capturing the same types of logs and have consistent configurations; comparing the two sink definitions directly, as sketched below, is the quickest way to spot a difference.
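
Both definitions can be dumped with gcloud; ORG_SINK_NAME, PROJECT_SINK_NAME, ORG_ID, and PROJECT_ID are placeholders for your own names and IDs:

# Show the organization-level sink, including its filter and writerIdentity.
gcloud logging sinks describe ORG_SINK_NAME --organization=ORG_ID

# Show the project-level sink for comparison.
gcloud logging sinks describe PROJECT_SINK_NAME --project=PROJECT_ID

If the filters differ (for example, one excludes Data Access logs), that alone can explain missing metadataJson values.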

