GCP Audit Logs Integration Guide

Set up the GCP Audit Logs connector.

Overview

Radiant Security leverages Google Cloud's Audit Logs and BigQuery to monitor and analyze activities within Google Cloud environments. Audit Logs capture detailed records of actions taken across your Google Cloud projects, while BigQuery allows for powerful querying and analysis of these logs.

In this guide, you'll complete the following steps:

  • Enable audit logs.
  • Create a service account and service account keys.
  • Create a BigQuery dataset.
  • Create a log sink.
  • Verify incoming logs in BigQuery.

Important note: GCP Audit Logs record all access events made by users and services within your environment. This may result in unexpected storage costs for both the logging service and BigQuery storage. To understand the potential costs, we recommend reviewing the pricing guide for Logging. We also recommend assessing the pricing for heavily used resources in your environment, such as Bigtable. It’s also a good practice to monitor your billing forecast and your Logs Storage usage, which can be found under Logging > Logs Storage.

Prerequisites

Before you begin, ensure that you have the following permissions:

  • Owner or Editor role on your Google Cloud organization.
  • IAM Admin role to create and manage service accounts.
  • Logging Admin role to configure audit logs and log sinks.
  • BigQuery Admin role to create datasets and manage BigQuery resources.

Enable audit logs

As previously noted, some services may generate high volumes of logs, potentially increasing your billing costs. We recommend enabling logging for all services by following steps 4 and 5 below. If you later find that specific services are generating excessive logs, you can disable logging for them by following step 6.
  1. Access the Google Cloud console.
  2. Select your organization as the scope on the top part of the page.
  3. From the left side menu, navigate to IAM & Admin > Audit Logs.
  4. Scroll to the bottom of the page and select 200 as the number of rows per page, so that all services appear on a single page and can be enabled at once.
  5. Click the first checkbox to select all services. In the window that appears, under Permission Types, select Admin Read, Data Read, and Data Write. Click Save.
  6. (Optional) To disable logs for specific services:
    1. Search for and select the service that you want to disable, for example, Bigtable.
    2. In the window that appears, under Permission Types, ensure that all log types are unselected. Click Save.
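If you manage this configuration as code, the console selections above correspond to the auditConfigs section of your organization's IAM policy. The snippet below is a minimal sketch of that structure as a Python dictionary; how you merge it into the policy (for example, with gcloud or the Resource Manager API) is up to you and not covered in this guide.

```python
# Sketch only: the IAM-policy auditConfigs entry that matches selecting
# Admin Read, Data Read, and Data Write for all services in the console.
audit_configs = [
    {
        "service": "allServices",  # every service, as in step 5 above
        "auditLogConfigs": [
            {"logType": "ADMIN_READ"},
            {"logType": "DATA_READ"},
            {"logType": "DATA_WRITE"},
        ],
    }
]
```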


Create a service account

If you have already created a service account for the Security Command Center (SCC) connector, you might be able to reuse it. Otherwise, select a project to which the service account will belong. The only constraint is that the service account must be in the same project as the BigQuery dataset that you will create in a later step.

  1. Go to IAM & Admin > Service Accounts.
  2. Click Create Service Account.
  3. Under Service account name, enter radiant-audit-logs-connector.
  4. Under Service account description, enter a description for easy identification later.
  5. Copy the generated email address and save it for later use.
  6. Click Create and Continue.
  7. In the Role section, add the following roles:
    • Logs Viewer
    • BigQuery Admin
  8. Click Continue and then Done.
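If you prefer to script this step, the same service account can be created with the IAM API through the google-api-python-client library. This is a minimal sketch that assumes a placeholder project ID (my-project); granting the Logs Viewer and BigQuery Admin roles from step 7 is a separate IAM policy change and is not shown here.

```python
from googleapiclient import discovery  # pip install google-api-python-client

# Assumed placeholder; replace with the project that will own the service account.
PROJECT_ID = "my-project"

iam = discovery.build("iam", "v1")

# Create the service account with the name used in step 3.
account = iam.projects().serviceAccounts().create(
    name=f"projects/{PROJECT_ID}",
    body={
        "accountId": "radiant-audit-logs-connector",
        "serviceAccount": {"displayName": "Radiant audit logs connector"},
    },
).execute()

# Save this email address for later use (step 5).
print(account["email"])
```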

Create service account keys

  1. Click on the newly created service account to open its details.
  2. Click the Keys tab.
  3. Click Add Key and choose Create New Key.
  4. Select JSON and click Create.
  5. The JSON file downloads automatically. Store it securely, as it contains the service account's private key.
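A quick way to confirm the downloaded key is valid is to load it with the google-auth library and open a client with it. A minimal sketch, assuming the key file was saved as radiant-audit-logs-connector.json (the file name is an assumption):

```python
from google.oauth2 import service_account
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Assumed local path to the JSON key downloaded in step 5.
KEY_PATH = "radiant-audit-logs-connector.json"

credentials = service_account.Credentials.from_service_account_file(KEY_PATH)

# The key file embeds the project ID, so the client can reuse it directly.
client = bigquery.Client(credentials=credentials, project=credentials.project_id)
print(f"Authenticated against project: {client.project}")
```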

Create a BigQuery dataset

  1. From the left side menu, navigate to the BigQuery console.
  2. In the left panel, open the menu next to your project and click Create Dataset.
  3. Under Dataset ID, enter radiant_connector.
  4. Set Default maximum table age to 30 days.
  5. Click Create Dataset.

Note: Double-check the spelling of the Dataset ID, as it’s needed to locate the dataset in a later step.
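If you script your setup, the same dataset can be created with the BigQuery Python client. A minimal sketch, assuming a placeholder project ID (my-project) and the 30-day default table expiration chosen above:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Assumed placeholder; use the project you selected for the service account.
PROJECT_ID = "my-project"

client = bigquery.Client(project=PROJECT_ID)

dataset = bigquery.Dataset(f"{PROJECT_ID}.radiant_connector")
# "Default maximum table age" of 30 days, expressed in milliseconds.
dataset.default_table_expiration_ms = 30 * 24 * 60 * 60 * 1000

client.create_dataset(dataset, exists_ok=True)
print(f"Created dataset {PROJECT_ID}.radiant_connector")
```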

Create a log sink

Important note: For this step, ensure that you are in the organization scope again.

  1. Go to Logging > Log Router.
  2. Click Create Sink.
  3. Under Sink name, enter radiant_audit_logs.
  4. Under Sink description, enter a description for easy identification later.
  5. Click Next.
  6. Under Select sink service, select BigQuery.
  7. For Sink destination, select Use a BigQuery dataset in a project. GCP will automatically fill the Sink destination as bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET_ID]. Replace PROJECT_ID with the ID of the project where you created the dataset, and replace DATASET_ID with the Dataset ID you created earlier: radiant_connector.
  8. Leave Use partitioned tables unselected and click Next.
  9. Select Include logs ingested by this organization and all child resources. Leave the Build inclusion filter section empty and click Next.
  10. Leave the Build exclusion filter empty and click Create Sink.

Note: If you get a Permission Denied error, make sure you have the Logging Admin role at the organization level. This can happen even if you have the Organization Admin role.
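For reference, an organization-level sink with the same settings can also be created programmatically with the google-cloud-logging library. The sketch below assumes placeholder organization and project IDs and uses that library's ConfigServiceV2Client; treat it as an outline of the console steps rather than a drop-in replacement.

```python
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import LogSink  # pip install google-cloud-logging

# Assumed placeholders; replace with your own IDs.
ORG_ID = "123456789012"
PROJECT_ID = "my-project"

config_client = ConfigServiceV2Client()

sink = LogSink(
    name="radiant_audit_logs",
    destination=f"bigquery.googleapis.com/projects/{PROJECT_ID}/datasets/radiant_connector",
    # "Include logs ingested by this organization and all child resources"
    include_children=True,
)

created = config_client.create_sink(parent=f"organizations/{ORG_ID}", sink=sink)

# The sink exports logs as this service account; it needs write access to the dataset.
print(created.writer_identity)
```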

Verify incoming logs in BigQuery

  1. Go back to the BigQuery Console.
  2. Navigate to the dataset you created.
  3. Check for incoming logs to ensure the setup is working correctly.
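You can also run this check with the BigQuery Python client. The sketch below lists the tables in the dataset; with an unpartitioned sink, exported audit logs typically land in date-sharded tables named along the lines of cloudaudit_googleapis_com_activity_YYYYMMDD (the exact table names and the my-project placeholder are assumptions, and the first tables may take a few minutes to appear after the sink is created).

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Assumed placeholder; use the project that owns the radiant_connector dataset.
PROJECT_ID = "my-project"

client = bigquery.Client(project=PROJECT_ID)

tables = list(client.list_tables(f"{PROJECT_ID}.radiant_connector"))
if not tables:
    print("No tables yet - the sink may still be provisioning; re-check in a few minutes.")
for table in tables:
    print(table.table_id)
```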

 

We value your opinion. Did you find this article helpful? Share your thoughts by reaching out to our Product and Customer Success teams at support@radiantsecurity.ai.

 

Last updated: 2024-08-23