# GCP Audit Logs

In this guide, you'll configure the integration between Radiant and Google Cloud using Audit Logs and BigQuery. Audit Logs capture detailed records of activity across your Google Cloud projects, while BigQuery enables fast and flexible analysis of those logs for security monitoring and investigation.

{% hint style="warning" %}
**Important note:** GCP Audit Logs record all access events made by users and services within your environment. This may result in unexpected storage costs for both the logging service and BigQuery storage. To understand the potential costs, we recommend reviewing the [pricing guide for Logging](https://cloud.google.com/stackdriver/pricing). We also recommend assessing the pricing for heavily used resources in your environment, such as [Bigtable](https://cloud.google.com/bigtable/docs/audit-log-estimate-costs). It’s also good practice to monitor your billing forecast and Logs Storage usage, which can be found under **Monitoring > Logs Storage**.
{% endhint %}

### Prerequisites

Before you begin, ensure that you have the following permissions:

* [ ] Owner or Editor role on your Google Cloud organization
* [ ] IAM Admin role to create and manage service accounts
* [ ] Logging Admin role to configure audit logs and log sinks
* [ ] BigQuery Admin role to create datasets and manage BigQuery resources

### Enable audit logs

As previously noted, some services may generate high volumes of logs, potentially increasing your billing costs. We recommend enabling logging for all services by following **steps 4** and **5**. If you later find that specific services are generating excessive logs, you can disable logging for them by following **step 6**.

1. Access the Google Cloud console.
2. Select your organization as the scope at the top of the page.
3. From the left side menu, navigate to **IAM & Admin > Audit Logs**.
4. Scroll to the end of the page and select **200** as the number of rows per page, so that every service appears on a single page and can be selected at once in the next step:

<figure><img src="https://2439665791-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPsFulb2ZOtSPcRSc2rXE%2Fuploads%2FPaDOLOL1M7O2RhV5FBMh%2FGCP_Audit_Logs_01.png?alt=media&#x26;token=c2aa57a2-55a1-46cd-857b-49b499f64557" alt=""><figcaption></figcaption></figure>

5. Click the first checkbox to select all services. In the window that appears, under **Permission Type**, select **Admin Read**, **Data Read**, and **Data Write**. Click **Save**.

<figure><img src="https://2439665791-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPsFulb2ZOtSPcRSc2rXE%2Fuploads%2Fg6OdzT3EYPIYiNcvKsch%2FGCP_Audit_Logs_02.png?alt=media&#x26;token=91666acc-028c-45b3-8f69-c58f429fc412" alt=""><figcaption></figcaption></figure>

6. (Optional) To disable logs for selected services:
   * Search for and select the service that you want to disable, for example, Bigtable.
   * In the window that appears, under **Permission Type**, ensure that all the log types are *unselected*. Click **Save**.

<figure><img src="https://2439665791-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPsFulb2ZOtSPcRSc2rXE%2Fuploads%2FUpgZmhxCuLlHSv2m8LtP%2FGCP_Audit_Logs_03.png?alt=media&#x26;token=0f5ceba3-4065-47a1-b4ea-3dff9045fecc" alt=""><figcaption></figcaption></figure>
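If you manage IAM with `gcloud` instead of the console, the same result comes from an `auditConfigs` entry in the organization's IAM policy. A minimal sketch of that fragment, modeled as a Python dict for illustration (the console steps above achieve the same thing, so this is optional):

```python
# IAM policy fragment that enables Admin Read, Data Read, and Data Write
# audit logs for every service, mirroring steps 4 and 5 above.
audit_configs = [
    {
        "service": "allServices",  # applies to all Google Cloud services
        "auditLogConfigs": [
            {"logType": "ADMIN_READ"},
            {"logType": "DATA_READ"},
            {"logType": "DATA_WRITE"},
        ],
    }
]
```

Note that entries for specific services are merged with the `allServices` entry rather than overriding it, which is why disabling a single service (step 6) is done through the console rather than by adding an empty per-service entry here.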

### Create a service account

If you have already created a service account for the Security Command Center (SCC) connector, you might be able to reuse it. Otherwise, select a project to which the service account will belong. The only constraint is that the service account must be in the same project as the BigQuery dataset that you will create later.

1. Go to **IAM & Admin > Service Accounts**.
2. Click **Create Service Account**.

<figure><img src="https://2439665791-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPsFulb2ZOtSPcRSc2rXE%2Fuploads%2FkgyR8J0mh5iL5a7VJ9VG%2FGCP_Audit_Logs_04.png?alt=media&#x26;token=2fdba75d-7c2d-44a4-bb43-f978acde1b6b" alt=""><figcaption></figcaption></figure>

3. Under **Service account name**, enter `radiant-audit-logs-connector`.
4. Under **Service account description**, enter a description for easy identification later.

<div align="left"><figure><img src="https://2439665791-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPsFulb2ZOtSPcRSc2rXE%2Fuploads%2Fsp8jfoB3G8pR5JLG1Tbq%2FGCP_Audit_Logs_05.png?alt=media&#x26;token=115be514-77a2-4385-885c-14d4266c5507" alt="" width="563"><figcaption></figcaption></figure></div>

5. Copy the generated email address and save it for later use.
6. Click **Create and Continue**.
7. In the **Role** section, add the following roles:
   * **Logs Viewer**
   * **BigQuery Admin**

<div align="left"><figure><img src="https://2439665791-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPsFulb2ZOtSPcRSc2rXE%2Fuploads%2FOEkpOQeeNdoxj9weg7h6%2FGCP_Audit_Logs_06.png?alt=media&#x26;token=6734e33a-ae83-4133-a84a-d3fcec78bb59" alt="" width="563"><figcaption></figcaption></figure></div>

8. Click **Continue** and then **Done**.
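The email address copied in step 5 follows Google Cloud's fixed convention of `NAME@PROJECT_ID.iam.gserviceaccount.com`, so you can reconstruct it later if needed. A small sketch (`my-project` is a hypothetical project ID used for illustration):

```python
def service_account_email(account_name: str, project_id: str) -> str:
    """Build the email address Google Cloud assigns to a service account."""
    return f"{account_name}@{project_id}.iam.gserviceaccount.com"

# "my-project" is a placeholder; substitute the project you chose above.
email = service_account_email("radiant-audit-logs-connector", "my-project")
```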

### Create service account keys

1. Click on the newly created service account to open its details.
2. Click the **Keys** tab.
3. Click **Add Key** and choose **Create New Key**.

<div align="left"><figure><img src="https://2439665791-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPsFulb2ZOtSPcRSc2rXE%2Fuploads%2FhEPFut0ywFu3dI2xmKCT%2FGCP_Audit_Logs_07.png?alt=media&#x26;token=e85a2c8e-c625-49b4-9413-0b28bc41d746" alt="" width="563"><figcaption></figcaption></figure></div>

4. Select **JSON** and click **Create**.
5. The JSON file downloads automatically. Store it securely, as it contains the service account's private key.
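Before handing the downloaded key to Radiant, you can sanity-check that it contains the fields a service-account key is expected to carry. A minimal sketch; the field list reflects the standard key format, and the sample input in the usage note is synthetic:

```python
import json

# Fields present in every Google service-account key file.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_key(raw: str) -> dict:
    """Parse a service-account key file and verify its required fields."""
    key = json.loads(raw)
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"key file is missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError("not a service-account key")
    return key
```

In practice you would pass the contents of the downloaded file, e.g. `check_key(open("key.json").read())`.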

### Create a BigQuery Dataset

1. From the left side menu, navigate to **BigQuery Console**.
2. In the left panel, expand the menu next to your project and click **Create dataset**.

<div align="left"><figure><img src="https://2439665791-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPsFulb2ZOtSPcRSc2rXE%2Fuploads%2FxoednIWcp4oXf1rTK47d%2FGCP_Audit_Logs_08.png?alt=media&#x26;token=52a145eb-6791-4662-bb54-c1e4741c78f2" alt="" width="563"><figcaption></figcaption></figure></div>

3. Under **Dataset ID**, enter `radiant_connector`.
4. Set **Default maximum table age** to **30 days**.

<div align="left"><figure><img src="https://2439665791-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPsFulb2ZOtSPcRSc2rXE%2Fuploads%2FQYsxCd5KkyXySAOskPbL%2FGCP_Audit_Logs_09.png?alt=media&#x26;token=18ecbea5-ac3c-43e6-86ae-c653fc0bace9" alt="" width="563"><figcaption></figcaption></figure></div>

5. Click **Create Dataset**.

{% hint style="info" %}
**Note**: Double-check the spelling of the **Dataset ID**, as it’s needed to locate the dataset in a later step.
{% endhint %}
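BigQuery dataset IDs may contain only letters, numbers, and underscores, which is why the ID above is `radiant_connector` rather than a hyphenated name. A minimal sketch of that naming rule:

```python
import re

# BigQuery dataset IDs: letters, digits, and underscores only (up to 1024
# characters); hyphens and other punctuation are rejected.
DATASET_ID_RE = re.compile(r"^[A-Za-z0-9_]{1,1024}$")

def is_valid_dataset_id(dataset_id: str) -> bool:
    return DATASET_ID_RE.fullmatch(dataset_id) is not None
```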

### Create a log sink

{% hint style="warning" %}
**Important** **note**: For this step, ensure that you are in the organization scope again.
{% endhint %}

1. Go to **Logging >** **Log Router**.
2. Click **Create sink**.
3. Under **Sink name**, enter `radiant_audit_logs`.
4. Under **Sink description**, enter a description for easy identification later.
5. Click **Next**.

<div align="left"><figure><img src="https://2439665791-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPsFulb2ZOtSPcRSc2rXE%2Fuploads%2FvfcoW6QpYq5Bic8vULhc%2FGCP_Audit_Logs_10.png?alt=media&#x26;token=52cdf7bf-3a51-47b1-9f7d-20968ea136e9" alt="" width="563"><figcaption></figcaption></figure></div>

6. Under **Select sink service**, select **BigQuery**.
7. For **Sink destination**, select **Use a BigQuery dataset in a project**.

<div align="left"><figure><img src="https://2439665791-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPsFulb2ZOtSPcRSc2rXE%2Fuploads%2F3HgdU3isbtMJ1jo892qD%2FGCP_Audit_Logs_11.png?alt=media&#x26;token=46bf2571-ab33-4f63-926c-ef1bbdb9800d" alt="" width="563"><figcaption></figcaption></figure></div>

GCP will automatically fill the **Sink destination** as: `bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET_ID]`

* Replace **PROJECT\_ID** with the ID of the project where you created the dataset.
* Replace **DATASET\_ID** with the value created earlier: `radiant_connector`.

8. Leave **Use partitioned tables** unselected and click **Next**.
9. Select **Include logs ingested by this organization and all child resources**. Leave the **Build inclusion filter** section empty and click **Next**.

<div align="left"><figure><img src="https://2439665791-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPsFulb2ZOtSPcRSc2rXE%2Fuploads%2FLGCM6DDcCPAjdqDiVuCY%2FGCP_Audit_Logs_12.png?alt=media&#x26;token=502d60ad-a900-4613-b88f-502ae086ecfc" alt="" width="563"><figcaption></figcaption></figure></div>

10. Leave the **Build exclusion filter** empty and click **Create Sink**.

{% hint style="info" %}
**Note:** If you get the error **Permission Denied,** make sure you have the role **Logging Admin** at the Organization Level. This might happen even if you have the role **Organization Admin.**
{% endhint %}
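The **Sink destination** string from step 7 has a fixed shape, so it can be built programmatically if you script this setup. A small helper (`my-project` is a placeholder project ID; the dataset ID is the one created earlier in this guide):

```python
def sink_destination(project_id: str, dataset_id: str) -> str:
    """Build the BigQuery sink destination string used by the Log Router."""
    return f"bigquery.googleapis.com/projects/{project_id}/datasets/{dataset_id}"

# "my-project" is illustrative; use the project that holds the dataset.
destination = sink_destination("my-project", "radiant_connector")
```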

### Verify incoming logs in BigQuery

1. Go back to the **BigQuery Console**.
2. Navigate to the dataset you created.
3. Check for incoming logs to ensure the setup is working correctly.
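One way to check is to count recent rows with a SQL query. The sketch below builds such a statement; the table-name pattern assumes the dated-table layout that Cloud Logging produces when **Use partitioned tables** is left unselected (log names like `cloudaudit.googleapis.com/activity` become table prefixes with the punctuation replaced by underscores), and `my-project` is a placeholder:

```python
def verification_query(project_id: str, dataset_id: str) -> str:
    """Build a BigQuery SQL statement counting Admin Activity audit-log
    rows from the last hour, using a wildcard over the dated tables."""
    table = f"`{project_id}.{dataset_id}.cloudaudit_googleapis_com_activity_*`"
    return (
        f"SELECT COUNT(*) AS events FROM {table} "
        "WHERE timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)"
    )

query = verification_query("my-project", "radiant_connector")
```

Run the resulting statement in the BigQuery query editor; a non-zero count confirms that logs are flowing. New tables can take several minutes to appear after the sink is created.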


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://help.radiantsecurity.ai/radiant-connectors/data-connectors/gcp-audit-logs.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
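The request above can be issued from any HTTP client. A minimal sketch that builds a correctly encoded URL (the sample question is illustrative):

```python
from urllib.parse import urlencode

PAGE_URL = "https://help.radiantsecurity.ai/radiant-connectors/data-connectors/gcp-audit-logs.md"

def ask_url(question: str) -> str:
    """Build the documentation-query URL, URL-encoding the question."""
    return f"{PAGE_URL}?{urlencode({'ask': question})}"

# Example question; send a GET request to the resulting URL.
url = ask_url("Which roles does the service account need?")
```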
