Analyze audit logs using BigQuery

Lab · 1 hour 30 minutes · Credits: 2 · Introductory
IMPORTANT:

  • Make sure to complete this hands-on lab on a desktop/laptop only.

  • There are only 5 attempts permitted per lab.

  • As a reminder, it is common to not get every question correct on your first try, and even to need to redo a task; this is part of the learning process.

  • Once a lab is started, the timer cannot be paused. After 1 hour and 30 minutes, the lab will end and you'll need to start again.

  • For more information, review the Lab technical tips reading.

Activity overview

Google Cloud services write audit logs that record administrative activities and access within your Google Cloud resources. Audit log entries help you answer the questions "who did what, where, and when" within your Google Cloud projects. Enabling audit logs helps your security, auditing, and compliance entities monitor Google Cloud data and systems for possible vulnerabilities or external data misuse.

In this lab, you'll investigate audit logs to identify patterns of suspicious activity involving cloud resources.

Scenario

Cymbal Bank has officially migrated to its hybrid cloud solution and successfully deployed its workflows on the new cloud environment. Unfortunately, the Security Engineering team has been notified of a high severity alert involving unauthorized access to several of its cloud resources. This is alarming, since malicious actors can use compromised cloud resources to exfiltrate data and launch attacks on other systems.

It is your first time experiencing a security incident. Your team lead, Chloe, recognizes this as a valuable opportunity for you to learn the processes and procedures involved with incident response. You've been assigned to shadow and observe Hannah, an incident responder on the Incident Response Team, which is a unit of the Security Engineering department. Hannah has provided you with access to the alert's logs, which you'll use to investigate the malicious activity.

You want to get a better understanding of the security incident, so you have set up a test environment to recreate the incident and analyze the artifacts. You will use two separate user accounts: one account will generate the malicious activity, and the other account will be used to investigate the activity.

Here's how you'll do this task. First, you'll recreate the security incident by generating activity from the first user account. Next, you'll export the logs for further analysis. Then, you'll continue recreating the incident and generate additional user activity. Finally, you'll utilize BigQuery to analyze the logs.

Setup

Before you click Start Lab

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.

This practical lab lets you do the activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito or private browser window to run this lab. This prevents conflicts between your personal account and the Student account, which could cause extra charges to be incurred on your personal account.
  • Time to complete the lab---remember, once you start, you cannot pause a lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab to avoid extra charges to your account.

How to start your lab and sign in to the Google Cloud console

  1. Click the Start Lab button. On the left is the Lab Details panel with the following:

    • Time remaining
    • The Open Google Cloud console button
    • The temporary credentials that you must use for this lab
    • Other information, if needed, to step through this lab
    Note: If you need to pay for the lab, a pop-up opens for you to select your payment method.
  2. Click Open Google Cloud console (or right-click and select Open Link in Incognito Window) if you are running the Chrome browser. The Sign in page opens in a new browser tab.

    Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between them.

    Note: If the Choose an account dialog displays, click Use Another Account.
  3. If necessary, copy the Google Cloud username below and paste it into the Sign in dialog. Click Next.

{{{user_0.username | "Google Cloud username"}}}

You can also find the Google Cloud username in the Lab Details panel.

  4. Copy the Google Cloud password below and paste it into the Welcome dialog. Click Next.
{{{user_0.password | "Google Cloud password"}}}

You can also find the Google Cloud password in the Lab Details panel.

Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials.
Note: Using your own Google Cloud account for this lab may incur extra charges.
  5. Click through the subsequent pages:
    • Accept the terms and conditions
    • Do not add recovery options or two-factor authentication (because this is a temporary account)
    • Do not sign up for free trials

After a few moments, the Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking the Navigation menu at the top-left.

Activate Cloud Shell

Cloud Shell is an online development and operations environment accessible anywhere with your browser. Cloud Shell provides command-line access to your Google Cloud resources.

  1. Click Activate Cloud Shell (Activate Cloud Shell icon) at the top right of the Google Cloud console. You may be asked to click Continue.

After Cloud Shell starts up, you'll see a message displaying your Google Cloud Project ID for this session:

Your Cloud Platform project in this session is set to YOUR_PROJECT_ID

The command-line tool for Google Cloud, gcloud, comes pre-installed on Cloud Shell and supports tab-completion. In order to access Google Cloud, you'll first have to authorize gcloud.

  2. List the active account name with this command:
gcloud auth list
  3. A pop-up will appear asking you to Authorize Cloud Shell. Click Authorize.

  4. Your output should now look like this:

Output:

ACTIVE: *
ACCOUNT: student-01-xxxxxxxxxxxx@qwiklabs.net

To set the active account, run:
    $ gcloud config set account `ACCOUNT`
  5. List the project ID with this command:
gcloud config list project

Example output:

[core]
project = qwiklabs-gcp-44776a13dea667a6

Note: For full documentation of gcloud in Google Cloud, refer to the gcloud CLI overview guide.

Task 1. Generate account activity

Note: Make sure you are on the username 1: Google Cloud console.

In this task, you'll create and delete cloud resources to generate account activity which you'll access as Cloud Audit Logs.

  1. Copy the following commands into the Cloud Shell terminal:
gcloud storage buckets create gs://$DEVSHELL_PROJECT_ID
echo "this is a sample file" > sample.txt
gcloud storage cp sample.txt gs://$DEVSHELL_PROJECT_ID
gcloud compute networks create mynetwork --subnet-mode=auto
export ZONE=$(gcloud compute project-info describe \
--format="value(commonInstanceMetadata.items[google-compute-default-zone])")
gcloud compute instances create default-us-vm \
--machine-type=e2-micro \
--zone=$ZONE --network=mynetwork
gcloud storage rm --recursive gs://$DEVSHELL_PROJECT_ID
  2. Press ENTER.

Click Check my progress to verify that you have completed this task correctly. Generate account activity
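If you'd like to confirm that the activity was recorded before moving on, you can list the newest Admin Activity entries directly from Cloud Shell. A minimal sketch using gcloud logging read (the filter matches the same log name you'll query in the next task):

# Print the timestamp and method name of the five most recent Admin Activity entries
gcloud logging read \
  "logName=\"projects/$DEVSHELL_PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity\"" \
  --limit=5 \
  --format="value(timestamp, protoPayload.methodName)"

You should see entries such as storage.buckets.create from the commands above.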

Task 2. Export the audit logs

Note: Make sure you are on the username 1: Google Cloud console.

The activity you generated in the previous task was recorded as audit logs. In this task you'll export these logs to a BigQuery dataset for further analysis.

  1. In the Google Cloud console, in the Navigation menu (navigation menu icon) click Logging > Logs Explorer. The Logs Explorer page opens. (You may need to click More Products to expand the Navigation menu options and locate Logging under Operations.)
  2. When exporting logs, the current filter will be applied to what is exported. Copy the following query into the Query builder:
logName = ("projects/{{{project_0.project_id | Project ID}}}/logs/cloudaudit.googleapis.com%2Factivity")
  3. Click Run query. The query results should display in the Query results pane. This query filters for Cloud Audit logs within your project.
  4. Under the Query editor field, click More actions > Create sink. The Create logs routing sink dialog opens.
Note: If your browser window is narrow, the UI may display More instead of More actions.
  5. In the Create logs routing sink dialog, specify the following settings and leave all other settings at their defaults:

    Sink details
      • Sink name: AuditLogsExport
      • Click Next.

    Sink destination
      • Select sink service: BigQuery dataset
      • Select BigQuery dataset: Create new BigQuery dataset. The Create dataset dialog opens.

    Create dataset
      • Dataset ID: auditlogs_dataset
      • Click Create Dataset. The Create dataset dialog closes, and you return to the Sink destination dialog.

    Sink destination
      • Click Next.
      • Uncheck the Use Partitioned Tables checkbox, if it is already selected, and click Next.

    Choose logs to include in sink
      • Notice the pre-filled Build inclusion filter: logName=("projects/[PROJECT ID]/logs/cloudaudit.googleapis.com%2Factivity")
      • Click Next.
      • Click Create Sink.

    Return to the Logs Explorer page.
  6. In the Logging navigation pane, click Log Router to view the AuditLogsExport sink in the Log Router Sinks list.
  7. Inline with the AuditLogsExport sink, click More actions (More icon) > View sink details to view information about the AuditLogsExport sink you created. The Sink details dialog opens.
  8. Click Cancel to close the Sink details dialog when you're done viewing the sink information.

All future logs will now be exported to BigQuery, and the BigQuery tools can be used to perform analysis on the audit log data. The export does not export existing log entries.
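For reference, the same export can be configured from the command line. A minimal sketch, assuming the dataset is created first with bq mk (the console steps above already did both of these for you):

# Create the destination dataset, then route matching audit logs to it
bq mk --dataset auditlogs_dataset
gcloud logging sinks create AuditLogsExport \
  bigquery.googleapis.com/projects/$DEVSHELL_PROJECT_ID/datasets/auditlogs_dataset \
  --log-filter="logName=\"projects/$DEVSHELL_PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity\""

A sink created this way prints a writer identity (a service account) that you must grant the BigQuery Data Editor role on the dataset yourself; the console flow in this task sets up that permission automatically.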

Click Check my progress to verify that you have completed this task correctly. Export the audit logs

Task 3. Generate more account activity

Note: Make sure you are on the username 1: Google Cloud console.

In this task, you'll create and delete cloud resources to generate additional account activity which you'll then access in BigQuery to extract additional insights from the logs.

  1. Copy the following commands into the Cloud Shell terminal:
gcloud storage buckets create gs://$DEVSHELL_PROJECT_ID
gcloud storage buckets create gs://$DEVSHELL_PROJECT_ID-test
echo "this is another sample file" > sample2.txt
gcloud storage cp sample2.txt gs://$DEVSHELL_PROJECT_ID-test
export ZONE=$(gcloud compute project-info describe \
--format="value(commonInstanceMetadata.items[google-compute-default-zone])")
gcloud compute instances delete --zone=$ZONE \
--delete-disks=all default-us-vm

These commands generate more activity to view in the audit logs exported to BigQuery.

  2. Press ENTER.

When prompted, enter Y, and press ENTER. Notice you created two buckets and deleted a Compute Engine instance.

  3. When the prompt appears after a few minutes, continue by entering the following commands into the Cloud Shell terminal:
gcloud storage rm --recursive gs://$DEVSHELL_PROJECT_ID
gcloud storage rm --recursive gs://$DEVSHELL_PROJECT_ID-test
  4. Press ENTER.

Notice you deleted both buckets.

Click Check my progress to verify that you have completed this task correctly. Generate more account activity

Task 4. Sign in as the second user

You'll need to switch Google Cloud accounts by logging into the Google Cloud console using the second user account provided in the Lab Details panel. You will use this user account to analyze the logs.

  1. In the Google Cloud console, click on the user icon in the top-right corner of the screen, and then click Add account.
  2. Navigate back to the Lab Details panel, copy the Google Cloud username 2: and password. Then, paste the username and password into the Google Cloud console Sign in dialog.

Task 5. Analyze the Admin Activity logs

Note: Make sure you are on the username 2: Google Cloud console.

In this task, you'll review the Admin activity logs generated in the previous task. Your goal is to identify and apply filters to isolate logs that may indicate suspicious activity. This will enable you to export this subset of logs and streamline the process of analyzing them for potential issues.

Admin Activity logs record the log entries for API calls or other administrative actions that modify the configuration or metadata of resources. For example, the logs record when VM instances and App Engine applications are created and when permissions are changed.

Note: You can view audit log entries in the Logs Viewer, Cloud Logging, and in the Cloud SDK. You can also export audit log entries to Pub/Sub, BigQuery, or Cloud Storage.
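The steps below use the Logs Explorer UI, but the same filter syntax works with the Cloud SDK. A minimal sketch using gcloud logging read to pull the bucket-deletion entries you'll isolate in the UI (run from a Cloud Shell session in this project):

# Fetch Admin Activity entries for deleted Cloud Storage buckets as JSON
gcloud logging read \
  "logName=\"projects/$DEVSHELL_PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity\" AND protoPayload.serviceName=\"storage.googleapis.com\" AND protoPayload.methodName=\"storage.buckets.delete\"" \
  --format=json

Swapping --format=json for --format="value(protoPayload.authenticationInfo.principalEmail)" would print only the account that performed each deletion.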
  1. In the Google Cloud console, click the Navigation menu (navigation menu icon).
  2. Select Logging > Logs Explorer. The Logs Explorer page opens. (You may need to expand the More Products drop-down menu within the Navigation menu and locate Logging under Operations.)
  3. Ensure that the Show query toggle button is activated. This opens the Query builder field.
  4. Copy and paste the following query into the Query builder field. Notice your Google Cloud project ID in the query.
logName = ("projects/{{{project_0.project_id | "PROJECT_ID"}}}/logs/cloudaudit.googleapis.com%2Factivity")
  5. Click Run query.
  6. In the Query results, locate the log entry indicating that a Cloud Storage bucket was deleted; it will contain the storage.buckets.delete summary field. Summary fields are included in the log results to highlight important information about the log entry.

This entry refers to storage.googleapis.com, which calls the storage.buckets.delete method to delete a bucket. The bucket name is the same as your project ID.

  7. Within this entry, click the storage.googleapis.com text, and select Show matching entries. The Query results should now display only six entries related to created and deleted Cloud Storage buckets.
  8. In the Query editor field, notice that the line protoPayload.serviceName="storage.googleapis.com" was added to the query; this filters your results to entries matching only storage.googleapis.com.
  9. Within those query results, click storage.buckets.delete in one of the entries, and select Show matching entries.

Notice another line was added to the Query builder text:

logName = ("projects/{{{project_0.project_id | "PROJECT_ID"}}}/logs/cloudaudit.googleapis.com%2Factivity")
protoPayload.serviceName="storage.googleapis.com"
protoPayload.methodName="storage.buckets.delete"

The Query results should now display all entries related to deleted Cloud Storage buckets. You can use this technique to easily locate specific events.

  10. In the Query results, expand a storage.buckets.delete event by clicking the expand arrow (>) next to the entry.

  11. Expand the authenticationInfo field by clicking the expand arrow (>) next to it.

Notice the principalEmail field, which displays the email address of the user account that performed this action; this is the user 1 account you used to generate the user activity.

Task 6. Use BigQuery to analyze the audit logs

Note: Make sure you are on the username 2: Google Cloud console.

You've generated and exported logs to a BigQuery dataset. In this task, you'll analyze the logs using the Query editor.

Note: When you export logs to a BigQuery dataset, Cloud Logging creates dated tables to hold the exported log entries. Log entries are placed in tables whose names are based on the entries' log names.
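For example, an Admin Activity entry written on a given day lands in a table named like cloudaudit_googleapis_com_activity_YYYYMMDD. If you'd like to confirm which dated tables exist, a minimal sketch using the bq CLI from Cloud Shell (assuming the dataset name from Task 2):

# List the tables that the log export has created so far
bq ls auditlogs_dataset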
  1. In the Google Cloud console, click the Navigation menu (navigation menu icon).
  2. Click BigQuery.
Note: The Welcome to BigQuery in the Cloud Console message box appears providing links to the quickstart guide and release notes for UI updates. Click Done to proceed.
  3. In the Explorer pane, click the expand arrow beside your project ID. The auditlogs_dataset dataset is displayed.
Note: If auditlogs_dataset is not displayed, reload your browser window.

Next, verify that the BigQuery dataset has appropriate permissions to allow the export writer to store log entries.

  1. Click the auditlogs_dataset dataset.

  2. In the auditlogs_dataset toolbar, click the Sharing dropdown menu, and select Permissions.

  3. On the Share permission for "auditlogs_dataset" page, expand the BigQuery Data Editor section.

  4. Confirm that the service account used for log exports is a listed permission. The service account is similar to: service-xxxxxxxx@gcp-sa-logging.iam.gserviceaccount.com

    This permission is assigned automatically when log exports are configured, so checking for it is a useful way to confirm that log exports have been set up.

  5. Click Close to close the Share Dataset window.

  6. In the Explorer pane, click the expander arrow next to the auditlogs_dataset dataset to view the cloudaudit_googleapis_com_activity table. This table contains your exported logs.

  7. Select the cloudaudit_googleapis_com_activity table. The table schema displays. Take a moment to review the table schema and details.

  8. Expand the Query drop-down menu and select In new tab.

  9. In the Untitled query editor tab, delete any existing text, then copy and paste the following query:
SELECT
  timestamp,
  resource.labels.instance_id,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.resourceName,
  protopayload_auditlog.methodName
FROM
  `auditlogs_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
  PARSE_DATE('%Y%m%d', _TABLE_SUFFIX) BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) AND CURRENT_DATE()
  AND resource.type = "gce_instance"
  AND operation.first IS TRUE
  AND protopayload_auditlog.methodName = "v1.compute.instances.delete"
ORDER BY
  timestamp,
  resource.labels.instance_id
LIMIT 1000;

This query returns the users that deleted virtual machines in the last 7 days. The wildcard table name cloudaudit_googleapis_com_activity_* matches every dated export table, and _TABLE_SUFFIX holds the date portion of each matched table's name, so the PARSE_DATE condition limits the scan to tables from the last 7 days.

  10. Click Run.

After a couple of seconds, BigQuery will return each time a user deleted a Compute Engine virtual machine within the past 7 days. You should notice one entry, which is the activity you generated in the previous tasks as user 1. Remember, BigQuery shows only the activity that occurred after you created the export.
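The same analysis can also be run without the console. As a variation on the query above, a minimal sketch using the bq CLI from Cloud Shell to count instance deletions per user across all exported activity tables (assuming the dataset name from Task 2):

# Count VM deletions per principal across all dated activity tables
bq query --use_legacy_sql=false '
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail AS principal_email,
  COUNT(*) AS delete_count
FROM
  `auditlogs_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
  protopayload_auditlog.methodName = "v1.compute.instances.delete"
GROUP BY
  principal_email'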

  11. Replace the previous query in the Untitled tab with the following:
SELECT
  timestamp,
  resource.labels.bucket_name,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.resourceName,
  protopayload_auditlog.methodName
FROM
  `auditlogs_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
  PARSE_DATE('%Y%m%d', _TABLE_SUFFIX) BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) AND CURRENT_DATE()
  AND resource.type = "gcs_bucket"
  AND protopayload_auditlog.methodName = "storage.buckets.delete"
ORDER BY
  timestamp,
  resource.labels.bucket_name
LIMIT 1000;

This query returns the users that deleted Cloud Storage buckets in the last 7 days. You should notice two entries, which correspond to the activity you generated in the previous tasks as user 1.

  12. Click Run.

The ability to analyze audit logs in BigQuery is very powerful. In this activity, you viewed just two examples of querying audit logs.

Click Check my progress to verify that you have completed this task correctly. Use BigQuery to analyze the audit logs

Conclusion

Great work! You have successfully queried audit logs in Logs Explorer. You then exported the logs and created a BigQuery dataset, which you analyzed to investigate suspicious activity.

You have shown how you can use audit logs to filter for types of malicious activity, and then further analyze those logs in BigQuery to investigate threats.

End your lab

Before you end the lab, make sure you’re satisfied that you’ve completed all the tasks. When you're ready, click End Lab and then click Submit.

Ending the lab will remove your access to the lab environment, and you won’t be able to access the work you've completed in it again.

Copyright 2024 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.