
Getting Started with Apache Kafka and Confluent Platform on Google Cloud

Advanced 4 steps 5 hours 24 credits

Organizations around the world rely on Apache Kafka to integrate existing systems in real time and to build a new class of event-streaming applications that unlock new business opportunities. Google and Confluent have partnered to deliver the best event-streaming service based on Apache Kafka and to support event-driven applications and big data pipelines on Google Cloud Platform. In this quest, you will first learn how to deploy and create a streaming data pipeline with Apache Kafka. You will then complete hands-on labs covering different capabilities of the Confluent Platform, such as running Apache Kafka on GKE, building a Cloud Run streaming application with Apache Kafka on Anthos, and developing a streaming microservices application.

Objectives:

  • Run Apache Kafka on GKE
  • Build a Cloud Run streaming application with Apache Kafka on Anthos
  • Create a clickstream data analysis pipeline using ksqlDB
  • Develop a streaming microservices application

Quest Outline

Lab

Creating a Streaming Data Pipeline With Apache Kafka

In this lab, you create a streaming data pipeline with Kafka, getting a hands-on look at the Kafka Streams API. You will run a Java application that uses the Kafka Streams library to showcase a simple end-to-end data pipeline powered by Apache Kafka.
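The word-count pattern at the heart of a Streams pipeline can be sketched without a broker or the Kafka Streams library. A minimal Python stand-in (plain lists and dicts substitute for topics and state stores; the record values are hypothetical):

```python
# Stand-in for the Kafka Streams word-count topology: each message
# from an input "topic" is split into words, grouped by word, and
# counted into a table of running totals.

def word_count(records):
    """Consume (key, value) records; return a word -> count table."""
    counts = {}
    for _key, value in records:
        for word in value.lower().split():
            counts[word] = counts.get(word, 0) + 1
    return counts

# Simulated input topic with two messages.
input_topic = [(None, "all streams lead to kafka"),
               (None, "hello kafka streams")]

table = word_count(input_topic)
print(table["kafka"])  # the word "kafka" appears twice
```

In the real lab, the same grouping and counting is expressed with the Streams DSL (`flatMapValues`, `groupBy`, `count`) and the counts live in a changelog-backed state store rather than an in-memory dict.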

Lab

Confluent: Running Apache Kafka on GKE

This lab will show you how to set up a Confluent Kafka cluster within Google Kubernetes Engine and how to create streaming events.

Lab

Confluent: Clickstream Data Analysis Pipeline Using ksqlDB

This lab will cover how to create a clickstream data analysis pipeline using ksqlDB.
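To give a flavor of what such a pipeline looks like, here is a hedged ksqlDB sketch with a hypothetical topic name and schema, not the lab's exact statements:

```sql
-- Register the raw clickstream topic as a stream (hypothetical schema).
CREATE STREAM clickstream (
    ip VARCHAR,
    userid INT,
    request VARCHAR,
    status INT
) WITH (KAFKA_TOPIC = 'clickstream', VALUE_FORMAT = 'JSON');

-- Continuously count events per user over a one-minute tumbling window.
CREATE TABLE events_per_min AS
  SELECT userid, COUNT(*) AS events
  FROM clickstream
  WINDOW TUMBLING (SIZE 60 SECONDS)
  GROUP BY userid;
```

The `CREATE TABLE ... AS SELECT` statement runs as a persistent query, so the aggregate updates continuously as new click events arrive.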

Lab

Confluent: Developing a Streaming Microservices Application

This hands-on lab provides step-by-step instructions for developers to apply the basic principles of streaming applications using the Confluent Platform.

Enroll Now

Enroll in this quest to have your progress tracked until you earn the badge.