Route Datadog Monitoring Alerts to Google Cloud with Eventarc


Lab: 1 hour 30 minutes, 1 Credit, Introductory

This lab was developed with our partner, Datadog. Your personal information may be shared with Datadog, the lab sponsor, if you have opted-in to receive product updates, announcements, and offers in your Account Profile.

GSP1168


Overview

Eventarc makes it easy to connect Google Cloud services with events from a variety of sources. It allows you to build event-driven architectures in which microservices are loosely coupled and distributed. It also handles event ingestion, delivery, security, authorization, and error handling for you, which improves developer agility and application resilience.

Datadog is a monitoring and security platform for cloud applications. It brings together end-to-end traces, metrics, and logs to make your applications, infrastructure, and third-party services observable.

Objectives

In this lab, you learn how to route Datadog monitoring alerts to Google Cloud with Eventarc. You learn how to:

  • Discover the Datadog provider.
  • Set up a channel to the Datadog provider.
  • Create a workflow to log events.
  • Create an Eventarc trigger with the channel.
  • Create a Datadog monitor.
  • Test the Datadog monitor, Eventarc trigger and the workflow.
  • Enable Datadog's Google Cloud integration.
  • Create a workflow to check Compute Engine VMs.
  • Connect Datadog monitoring alerts to Workflows with Eventarc.
  • Create a Datadog monitor and alert on VM deletions.

Setup and Requirements

Before you click the Start Lab button

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.

This hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito or private browser window to run this lab. This prevents conflicts between your personal account and the Student account, which could cause extra charges to be incurred on your personal account.
  • Time to complete the lab. Remember, once you start, you cannot pause a lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab to avoid extra charges to your account.

How to start your lab and sign in to the Google Cloud console

  1. Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is the Lab Details panel with the following:

    • The Open Google Cloud console button
    • Time remaining
    • The temporary credentials that you must use for this lab
    • Other information, if needed, to step through this lab
  2. Click Open Google Cloud console (or right-click and select Open Link in Incognito Window if you are running the Chrome browser).

    The lab spins up resources, and then opens another tab that shows the Sign in page.

    Tip: Arrange the tabs in separate windows, side-by-side.

    Note: If you see the Choose an account dialog, click Use Another Account.
  3. If necessary, copy the Username below and paste it into the Sign in dialog.

    {{{user_0.username | "Username"}}}

    You can also find the Username in the Lab Details panel.

  4. Click Next.

  5. Copy the Password below and paste it into the Welcome dialog.

    {{{user_0.password | "Password"}}}

    You can also find the Password in the Lab Details panel.

  6. Click Next.

    Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.
  7. Click through the subsequent pages:

    • Accept the terms and conditions.
    • Do not add recovery options or two-factor authentication (because this is a temporary account).
    • Do not sign up for free trials.

After a few moments, the Google Cloud console opens in this tab.

Note: To view a menu with a list of Google Cloud products and services, click the Navigation menu at the top-left.

Activate Cloud Shell

Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5GB home directory and runs on Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources.

  1. Click Activate Cloud Shell at the top of the Google Cloud console.

When you are connected, you are already authenticated, and the project is set to your Project ID. The output contains a line that declares the Project ID for this session:

Your Cloud Platform project in this session is set to {{{project_0.project_id | "PROJECT_ID"}}}

gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.

  2. (Optional) You can list the active account name with this command:

gcloud auth list

  3. Click Authorize.

Output:

ACTIVE: *
ACCOUNT: {{{user_0.username | "ACCOUNT"}}}

To set the active account, run:
    $ gcloud config set account `ACCOUNT`

  4. (Optional) You can list the project ID with this command:

gcloud config list project

Output:

[core]
project = {{{project_0.project_id | "PROJECT_ID"}}}

Note: For full documentation of gcloud, in Google Cloud, refer to the gcloud CLI overview guide.

Task 1: Set up Cloud Shell

  1. In Cloud Shell, set your project ID and save it as the PROJECT_ID variable.

Also, set a REGION variable to the default region as described below. This is the region you will create resources in later.

PROJECT_ID={{{project_0.project_id|Project ID}}}
REGION={{{project_0.default_region|Default Region}}}
gcloud config set core/project $PROJECT_ID

Task 2: Create Compute Engine VMs

You will start with some Compute Engine virtual machines (VMs). You will not use them right away; later in the lab, you will use Datadog's Google Cloud integration to monitor them.

  1. Create 2 Compute Engine VMs:
gcloud compute instances create instance-1 instance-2 \
  --machine-type=e2-medium \
  --zone {{{project_0.default_zone|Default Zone}}} \
  --scopes=https://www.googleapis.com/auth/cloud-platform

You should see the VMs created and running in the Cloud Console within a minute or so.

Click Check my progress to verify the objective. Create Compute Engine VMs

Task 3: Enable APIs

  1. Enable all necessary services:
gcloud services enable \
  eventarc.googleapis.com \
  eventarcpublishing.googleapis.com \
  workflows.googleapis.com \
  workflowexecutions.googleapis.com \
  compute.googleapis.com \
  cloudasset.googleapis.com \
  monitoring.googleapis.com
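If you want to confirm the APIs actually got enabled before moving on, a small helper can compare the enabled-services list against what the lab needs. This is a sketch: the `check_enabled` helper name is ours, and it takes the listing command as arguments so the gcloud call stays swappable.

```shell
# Hypothetical helper: verify that each required API appears in the
# enabled-services list produced by the command passed as arguments.
required_services="eventarc.googleapis.com workflows.googleapis.com compute.googleapis.com"

check_enabled() {
  local enabled missing=0
  enabled=$("$@")
  for svc in $required_services; do
    # -x matches whole lines, so partial service names do not count.
    if ! printf '%s\n' "$enabled" | grep -qx "$svc"; then
      echo "missing: $svc"
      missing=1
    fi
  done
  [ "$missing" -eq 0 ] && echo "all required services enabled"
}

# Usage against the live project:
# check_enabled gcloud services list --enabled --format='value(config.name)'
```

Passing the listing command as arguments also makes the helper easy to exercise with a stub before pointing it at a real project.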

Click Check my progress to verify the objective. Enable APIs

Task 4: Set up a Datadog Trial account

If you already have a trial account set up, you can use that. It is recommended that you do not use your production Datadog account to avoid cluttering the environment with test and training assets.

  1. Navigate to https://us5.datadoghq.com/signup and enter your name, email, company, and a password. Make sure United States (US5-Central) is selected.

  2. On the next page, close the Datadog sign up workflow by clicking on the Datadog icon.


Task 5: Discover the Datadog provider

An Eventarc provider is a service or entity that can emit events directly to Google Cloud which are then routed to your project. Third-party providers, such as Datadog, are non-Google Cloud providers that are integrated with Google Cloud through Eventarc.

  1. In Cloud Shell, run the following command to see the list of Google Cloud and third-party providers:
gcloud eventarc providers list

This lists Google Cloud and third-party providers and the locations they are available in:

...
NAME: storage.googleapis.com
LOCATION: asia
NAME: cloudaudit.googleapis.com
LOCATION: asia
NAME: pubsub.googleapis.com
LOCATION: asia
...

You can narrow down the list to third-party providers with this command:

gcloud eventarc providers list --filter='eventTypes.type!~^google*'

You should see Datadog in the list:

NAME: datadog
LOCATION: asia-northeast1
NAME: datadog
LOCATION: europe-west4
NAME: datadog
LOCATION: us-central1
NAME: datadog
LOCATION: us-east1
NAME: datadog
LOCATION: us-west1

You can also describe the Datadog provider to see the events it supports:

gcloud eventarc providers describe datadog --location $REGION

displayName: Datadog
eventTypes:
- filteringAttributes:
  - attribute: monitor_id
    description: The ID of the monitor that triggered the alert.
  - attribute: alert_type
    description: The type of the fired alert.
  - attribute: info
    description: Additional information about the alert.
  - attribute: msg_title
    description: The title of the alert.
  type: datadog.v1.alert
name: projects/qwiklabs-gcp-00-21fa5d7fc239/locations/us-central1/providers/datadog

You need to set up a channel to integrate your project with a provider. This involves creating a channel, retrieving channel details and sending those details to the provider. Once the provider has initialized the connection to the channel, the provider can start sending events to your project.

Create a channel

  1. Create a channel for the Datadog provider.

You can do it using gcloud:

CHANNEL_NAME=datadog-channel
gcloud eventarc channels create $CHANNEL_NAME \
  --provider datadog \
  --location $REGION

Or create it from the Channels section of the Eventarc page in Google Cloud Console:


Retrieve channel details

  1. Once the channel is created, retrieve the details of the channel from gcloud:
gcloud eventarc channels describe $CHANNEL_NAME --location $REGION

The output should be similar to the following:

activationToken: 1XkE7RCKqHJJargYFUWUnsxYn5GxmGS14ZPXtB5hdE7ysYDfd5y7wulNWJ7l1iTQIab0UepCzpxsDknFtwsIDaoQinifU1F4SF3FUVE
createTime: '2024-01-03T19:37:11.249013965Z'
name: projects/qwiklabs-gcp-00-21fa5d7fc239/locations/us-central1/channels/datadog-channel
provider: projects/qwiklabs-gcp-00-21fa5d7fc239/locations/us-central1/providers/datadog
pubsubTopic: projects/qwiklabs-gcp-00-21fa5d7fc239/topics/eventarc-channel-us-central1-datadog-channel-348
state: PENDING
uid: bbf8a4da-43e3-45b9-a4c7-2ed4d1c134c9
updateTime: '2024-01-03T19:37:18.633193289Z'

Similarly, you can see the channel from Google Cloud Console:

Note the channel full name and the activation token. You will need to send them to Datadog next.

The channel state indicates the channel's status. It can be one of the following:

  • PENDING—The channel has been created successfully and there is an activation token available to create a connection with the provider. To change the state of the channel from PENDING to ACTIVE, the token must be given to the provider and used to connect the channel within 24 hours of the channel's creation.
  • ACTIVE—The channel has been successfully connected with the provider. An ACTIVE channel is ready to receive and route events from the provider.
  • INACTIVE—The channel cannot receive events nor be reactivated. The provider is either disconnected from this channel or the channel activation token has expired and the provider isn't connected. To re-establish a connection with a provider, you must create a new channel for the provider.

An activation token is a single-use, time-restricted token used to create a connection between a provider and a subscriber's project. Only the specific provider selected during channel creation can use the token. The token is valid for 24 hours after the channel's creation; after that, the channel becomes INACTIVE.
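Because the state flips from PENDING to ACTIVE only after the provider consumes the token, a small polling loop can save repeated manual checks. This is a sketch: `wait_for_active` is our own helper name, and it takes the state-reading command as arguments so the gcloud call stays swappable.

```shell
# Hypothetical helper: poll a state-reporting command until it prints
# ACTIVE, giving up after roughly 5 minutes (30 tries x 10 seconds).
wait_for_active() {
  local state i
  for i in $(seq 1 30); do
    state=$("$@")
    if [ "$state" = "ACTIVE" ]; then
      echo "channel is ACTIVE"
      return 0
    fi
    echo "channel state: $state; retrying..."
    sleep 10
  done
  return 1
}

# Usage:
# wait_for_active gcloud eventarc channels describe "$CHANNEL_NAME" \
#   --location "$REGION" --format='value(state)'
```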

Send channel details to the provider

You need to send the following channel details to the Datadog provider:

  • Channel name (e.g., projects/qwiklabs-gcp-04-4e022c25f345/locations/us-east1/channels/datadog-channel)
  • Activation token (e.g., so5g4Kdasda7y2MSasdaGn8njB2)
  1. Log in to Datadog, go to the Integrations page, and search for Google Eventarc. Click the Configure button. If the integration is not installed, click the Install button.
  2. In the Configuration tab of the Google Eventarc integration, click + Add New. Enter the full channel name and the activation token you copied from Cloud Shell:

You should now see the channel in the list of channels and after a few seconds, you should also see the channel become active in Google Cloud Console:


Now, you're ready to use the channel!

Click Check my progress to verify the objective. Create Eventarc channel

Task 6: Create a simple workflow

You need a destination in Google Cloud to receive events from the provider. Eventarc supports a number of event destinations, such as Cloud Run, Workflows, and Kubernetes services. In this case, deploy a workflow that simply logs the received events.

  1. Paste the following in Cloud Shell to create a new file called workflow-datadog1.yaml:
cat << "EOF" > workflow-datadog1.yaml
main:
  params: [event]
  steps:
    - logStep:
        call: sys.log
        args:
          data: ${event}
EOF

Note that the workflow is receiving an event as a parameter. This event will come from Datadog monitoring via Eventarc. Once the event is received, the workflow simply logs the received event.

  2. Deploy the workflow:
WORKFLOW_NAME=workflow-datadog1
gcloud workflows deploy $WORKFLOW_NAME \
  --source workflow-datadog1.yaml \
  --location $REGION

The workflow is deployed but it's not running yet. It will be executed by an Eventarc trigger when a Datadog alert is received.
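If you'd like to sanity-check the workflow before the trigger exists, you can execute it directly with a hand-made payload. The payload below is purely illustrative: the field names mirror the filtering attributes from the provider description, not the exact envelope a Datadog event arrives in via Eventarc.

```shell
# Illustrative payload only; real Datadog alerts arrive through
# Eventarc with their own CloudEvents envelope.
SAMPLE_EVENT='{"alert_type":"error","msg_title":"Manual smoke test"}'
echo "$SAMPLE_EVENT"

# Execute the workflow once with this payload:
# gcloud workflows run workflow-datadog1 \
#   --location "$REGION" \
#   --data "$SAMPLE_EVENT"
```

The execution's logs should then show the payload echoed by the workflow's sys.log step.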

Click Check my progress to verify the objective. Create a simple workflow

Task 7: Create an Eventarc trigger

You are now ready to connect events from the Datadog provider to Workflows with an Eventarc trigger.

Configure service account

You need a service account with the eventarc.eventReceiver role when creating a trigger. You can either create a dedicated service account or use the default compute service account.

  1. For simplicity, use the default compute service account and grant the eventarc.eventReceiver role:
PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format='value(projectNumber)')
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member serviceAccount:$PROJECT_NUMBER-compute@developer.gserviceaccount.com \
  --role roles/eventarc.eventReceiver

Create a trigger

  1. Create a trigger with the Datadog channel, event type and also a workflow destination:
gcloud eventarc triggers create datadog-trigger1 \
  --location $REGION \
  --destination-workflow $WORKFLOW_NAME \
  --destination-workflow-location $REGION \
  --channel $CHANNEL_NAME \
  --event-filters type=datadog.v1.alert \
  --service-account $PROJECT_NUMBER-compute@developer.gserviceaccount.com
  2. You can list the triggers to see that the newly created trigger is active:
gcloud eventarc triggers list --location $REGION

The output will look like this:

NAME: datadog-trigger1
TYPE: datadog.v1.alert
DESTINATION: Workflows: workflow-datadog1
ACTIVE: Yes

Click Check my progress to verify the objective. Create Eventarc trigger

Task 8: Create a Datadog monitor

You will now create a Datadog monitor and connect it to Eventarc.

It will be a Hello World type monitor with default values. You will manually trigger it to generate the monitoring alerts which in turn will generate an Eventarc event in Google Cloud.

  1. To create a monitor in Datadog, log in to Datadog. Hover over Monitors in the main menu and click New Monitor in the sub-menu. There are many monitor types. Choose the Metric monitor type.

  2. In the New Monitor page, leave the defaults for steps 1 and 2.

  • In step 3, set Alert threshold to 1
  • In step 4, set Test monitor for Eventarc as the monitor name and set Notify your services and your team members to @eventarc_your-project-id_your-region_your-channel-name
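
The notification handle follows a fixed pattern built from your project ID, region, and channel name. The sketch below shows how the pieces assemble; the values are examples, and you should verify the final handle against what the Google Eventarc integration tile in Datadog displays for your channel.

```shell
# Example values; substitute your own project, region, and channel.
EXAMPLE_PROJECT_ID=my-project-id
EXAMPLE_REGION=us-central1
EXAMPLE_CHANNEL_NAME=datadog-channel

# Handle pattern: @eventarc_<project>_<region>_<channel>
HANDLE="@eventarc_${EXAMPLE_PROJECT_ID}_${EXAMPLE_REGION}_${EXAMPLE_CHANNEL_NAME}"
echo "$HANDLE"
```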

Keep the monitor page open for the next step where you will test the monitor.

Task 9: Test monitor and trigger

To test the Datadog monitor and the Eventarc trigger, you will manually trigger the monitor.

  1. At the bottom of the monitor creation page, click on the Test Notifications button:
  2. Then, click on the Run Test button:

This should simulate the state transition in the monitor and trigger an Eventarc event.

  3. Back in the Google Cloud Console, from the Navigation Menu, go to the Workflows page. Check the workflow-datadog1 workflow. You should see that there's a new execution:
  4. Check the details of the execution. You should see the Datadog event type datadog.v1.alert generated from the monitoring alert in the input of the workflow and also in the logs:

Click Check my progress to verify the objective. Test Datadog monitor and Eventarc trigger Execution

Task 10: Enable Datadog's Google Cloud integration

To use Datadog to monitor a project, you need to enable APIs needed for Datadog, create a service account, and connect the service account to Datadog.

Create a service account

Datadog's Google Cloud integration uses a service account to make calls to the Cloud Logging API to collect node-level metrics from your Compute Engine instances.

  1. In Cloud Shell, create a service account for Datadog:
DATADOG_SA_NAME=datadog-service-account
gcloud iam service-accounts create $DATADOG_SA_NAME \
  --display-name "Datadog Service Account"

  2. Enable the Datadog service account to collect metrics, tags, events, and user labels by granting the following IAM roles:

DATADOG_SA_EMAIL=$DATADOG_SA_NAME@$PROJECT_ID.iam.gserviceaccount.com
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member serviceAccount:$DATADOG_SA_EMAIL \
  --role roles/cloudasset.viewer
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member serviceAccount:$DATADOG_SA_EMAIL \
  --role roles/compute.viewer
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member serviceAccount:$DATADOG_SA_EMAIL \
  --role roles/monitoring.viewer

Note: These steps came directly from Datadog's documentation. For more information, visit https://docs.datadoghq.com/integrations/google_cloud_platform/#setup
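To double-check that all three role bindings landed, you can filter the project IAM policy for the service account. The `roles_for_member` helper below is our own illustration; it reads a JSON policy on stdin so the same logic can be exercised without a live project.

```shell
# Hypothetical helper: print the roles bound to a given member in an
# IAM policy document supplied on stdin (JSON, as emitted by
# `gcloud projects get-iam-policy --format=json`).
roles_for_member() {
  python3 -c '
import json, sys
member = sys.argv[1]
policy = json.load(sys.stdin)
for binding in policy.get("bindings", []):
    if member in binding.get("members", []):
        print(binding["role"])
' "$1"
}

# Usage against the live project:
# gcloud projects get-iam-policy "$PROJECT_ID" --format=json |
#   roles_for_member "serviceAccount:${DATADOG_SA_EMAIL}"
# You should see roles/cloudasset.viewer, roles/compute.viewer,
# and roles/monitoring.viewer in the output.
```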

Add the Datadog principal to your service account

  1. In Datadog, navigate to Integrations, search for Google Cloud Platform and select it.

  2. Click on Add GCP Account. If you have no configured projects, you are automatically redirected to this page.

  3. If you have not generated a Datadog principal for your org, click the Generate Principal button.

  4. Copy your Datadog principal to the clipboard and keep it for the next section.

  5. Back in the Google Cloud console, under the Service Accounts menu, find the service account you created in the first section.

  6. Go to the Permissions tab and click on Grant Access.

  7. Paste your Datadog principal into the New principals text box.

  8. Assign the role of Service Account Token Creator and click Save.

Click Check my progress to verify the objective. Enable Datadog's Google Cloud integration

Note: If you previously configured access using a shared Datadog principal, you can revoke the permission for that principal after you complete these steps.

Complete the integration setup in Datadog

  1. In your Google Cloud console, navigate to the Service Account > Details tab for the service account you created. There, you can find the email associated with this Google service account. It resembles <sa-name>@<project-id>.iam.gserviceaccount.com. Copy this email.

  2. Return to the integration configuration tile in Datadog (where you copied your Datadog principal in the previous section).

  3. In the box under Add Service Account Email, paste the email you previously copied.

  4. Click on Verify and Save Account.

Task 11: Create a workflow

Now that you have 2 VMs running, create a workflow that will respond to alerts from a Datadog monitor. The workflow can be as sophisticated as you like, but in this case it will check the number of VM instances running and, if the count falls below 2, create new VM instances so that 2 VMs are running at all times.

  1. Copy the following contents and paste them in Cloud Shell to create a workflow-datadog2.yaml file:
cat << "EOF" > workflow-datadog2.yaml
main:
  params: [event]
  steps:
    - init:
        assign:
          - projectId: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          - zone: "{{{project_0.default_zone|Default Zone}}}"
          - minInstanceCount: 2
          - namePattern: "datadog-instance-##"
    - listInstances:
        call: googleapis.compute.v1.instances.list
        args:
          project: ${projectId}
          zone: ${zone}
        result: listResult
    - getInstanceCount:
        steps:
          - initInstanceCount:
              assign:
                - instanceCount: 0
          - setInstanceCount:
              switch:
                - condition: ${"items" in listResult}
                  steps:
                    - stepA:
                        assign:
                          - instanceCount: ${len(listResult.items)}
    - findDiffInstanceCount:
        steps:
          - assignDiffInstanceCount:
              assign:
                - diffInstanceCount: ${minInstanceCount - instanceCount}
          - logDiffInstanceCount:
              call: sys.log
              args:
                data: ${"instanceCount->" + string(instanceCount) + " diffInstanceCount->" + string(diffInstanceCount)}
    - endEarlyIfNeeded:
        switch:
          - condition: ${diffInstanceCount < 1}
            next: returnResult
    - bulkInsert:
        call: googleapis.compute.v1.instances.bulkInsert
        args:
          project: ${projectId}
          zone: ${zone}
          body:
            count: ${diffInstanceCount}
            namePattern: ${namePattern}
            instanceProperties:
              machineType: "e2-micro"
              disks:
                - autoDelete: true
                  boot: true
                  initializeParams:
                    sourceImage: projects/debian-cloud/global/images/debian-10-buster-v20220310
              networkInterfaces:
                - network: "global/networks/default"
        result: bulkInsertResult
    - returnResult:
        return: ${bulkInsertResult}
EOF

Note that the workflow is receiving an event as a parameter. This event will come from Datadog monitoring via Eventarc. Once the event is received, the workflow checks the number of running instances and creates new VM instances, if needed.
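Stripped of the Workflows syntax, the core of the workflow is simple reconciliation arithmetic. The shell sketch below mirrors the findDiffInstanceCount and endEarlyIfNeeded steps with example values:

```shell
# Mirror of the workflow's reconciliation logic: create instances only
# when the running count is below the target.
min_instance_count=2
instance_count=1   # e.g. what instances.list returned after a deletion

diff_instance_count=$(( min_instance_count - instance_count ))
if [ "$diff_instance_count" -ge 1 ]; then
  # In the workflow, this is where bulkInsert is called.
  echo "bulkInsert count=$diff_instance_count"
else
  # Matches the endEarlyIfNeeded branch: nothing to create.
  echo "nothing to do"
fi
```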

  2. Deploy the workflow:
WORKFLOW_NAME=workflow-datadog2
gcloud workflows deploy $WORKFLOW_NAME \
  --source workflow-datadog2.yaml \
  --location $REGION

The workflow is deployed but it's not running yet. It will be executed by an Eventarc trigger when a Datadog alert is received.

Click Check my progress to verify the objective. Create a workflow

Task 12: Create an Eventarc trigger for Compute

You are now ready to connect events from the Datadog provider to Workflows with an Eventarc trigger. You will use the channel and the service account you set up earlier in this lab.

  1. Create a trigger with the Datadog channel, event type and also a workflow destination:
PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format='value(projectNumber)')
gcloud eventarc triggers create datadog-trigger2 \
  --location $REGION \
  --destination-workflow $WORKFLOW_NAME \
  --destination-workflow-location $REGION \
  --channel $CHANNEL_NAME \
  --event-filters type=datadog.v1.alert \
  --service-account $PROJECT_NUMBER-compute@developer.gserviceaccount.com

You can list the triggers to see that the newly created trigger is active:

gcloud eventarc triggers list --location $REGION

NAME: datadog-trigger2
TYPE: datadog.v1.alert
DESTINATION: Workflows: workflow-datadog2
ACTIVE: Yes

Click Check my progress to verify the objective. Create an Eventarc trigger for Compute

Task 13: Create a Datadog monitor for compute

You will now create a Datadog monitor and connect it to Eventarc.

The monitor will check the number of Compute Engine VMs running and alert if it falls below 2.

  1. To create a monitor in Datadog, log in to Datadog. Hover over Monitors in the main menu and click New Monitor in the sub-menu. There are many monitor types. Choose the Integration monitor type, and then select the Google Cloud Platform tile.

  2. In the New Monitor page, create a monitor with the following:

  • Choose the detection method: Threshold Alert.

  • Define the metric: gcp.gce.instance.is_running from (everywhere) sum by (everything)

  • Evaluate the minimum of the query over the last 5 minutes

  • Set alert conditions:

    • Trigger when the metric is below the threshold
    • Alert threshold: < 2
    • Example Monitor name: Compute Engine instances < 2
    • Notify your services and your team members: @eventarc_{{{project_0.project_id | "PROJECT_ID"}}}_your-region_your-channel-name
  3. Now, click Create at the bottom to create the monitor.

Task 14: Test monitor and trigger

  1. To test the Datadog monitor, the Eventarc trigger and eventually the workflow, you will delete one of the VMs:
gcloud compute instances delete instance-2 --zone {{{project_0.default_zone|Default Zone}}}

After a few seconds, you should see the instance deleted in Google Cloud Console.

There's a bit of latency before this change shows up in Datadog. After some time (typically about 10 minutes), you should see the monitor detect the change and alert on it in the Manage Monitors section of Datadog:

Tip: If you don't want to wait for the Datadog monitor to alert, you can trigger the alert manually by going to the monitor page, editing the monitor, and clicking the Test Notifications button.

Once the Datadog monitor alerts, you should see that alert go to Workflows via Eventarc. If you check the Workflows logs, you should see that the workflow computed the difference between the current and expected instance counts:

2022-09-28 09:30:53.371 BST instanceCount->1 diffInstanceCount->1

It responds to the alert by creating a new VM instance with the datadog-instance-## name pattern.
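The `datadog-instance-##` pattern is expanded server-side by the bulkInsert API, with each `#` becoming one digit of a zero-padded sequence number. The sketch below is our local illustration of that naming scheme, not code the workflow runs:

```shell
# Each '#' in the pattern becomes one digit of the instance's sequence
# number, zero-padded to the number of '#' characters (two here).
pattern="datadog-instance-##"
prefix="${pattern%%#*}"          # strip from the first '#' onward
for i in 1 2; do
  printf '%s%02d\n' "$prefix" "$i"
done
```

So with `##`, the first instances created are named datadog-instance-01, datadog-instance-02, and so on.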

In the end, you will still have 2 VMs in your project: the one you created initially and one created by Workflows after the Datadog alert!

Note: You may get a deprecation warning; you can ignore it.

Click Check my progress to verify the objective. Test Datadog monitor and trigger

Congratulations

In this lab, you got hands-on experience with Datadog monitors and Google Cloud Eventarc.

Next Steps / Learn More

Be sure to check out other Datadog labs for more practice.

Google Cloud training and certification

...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.

Manual Last Updated April 19, 2024

Lab Last Tested April 19, 2024

Copyright 2024 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.