Getting Started with Apache Kafka and Confluent Platform on Google Cloud
Advanced · 4 Steps · 5 hours · 24 Credits
Organizations around the world rely on Apache Kafka to integrate existing systems in real time and to build a new class of event streaming applications that unlock new business opportunities. Google and Confluent have partnered to deliver an event streaming service based on Apache Kafka and to support building event-driven applications and big data pipelines on Google Cloud Platform. In this Quest, you will first learn how to deploy Apache Kafka and create a streaming data pipeline with it. You will then complete hands-on labs covering different capabilities of the Confluent Platform, such as running Apache Kafka on GKE, building a Cloud Run streaming application with Apache Kafka on Anthos, and developing a streaming microservices application.
- Run Apache Kafka on GKE
- Build a Cloud Run streaming application with Apache Kafka on Anthos
- Create a clickstream data analysis pipeline using ksqlDB
- Develop a streaming microservices application
In this lab, you create a streaming data pipeline with Kafka, getting a hands-on look at the Kafka Streams API. You will run a Java application that uses the Kafka Streams library to showcase a simple end-to-end data pipeline powered by Apache Kafka.
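To give a feel for what such a Java application looks like, here is a minimal sketch of a Kafka Streams topology. This is not the lab's actual code: the topic names `input-topic` and `output-topic` and the uppercase transformation are illustrative assumptions, and running it against a real cluster would additionally require broker configuration and a `KafkaStreams` instance.

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;

public class PipelineSketch {

    // Builds a tiny end-to-end pipeline: read from one topic,
    // transform each record value, write to another topic.
    // Topic names here are illustrative, not from the lab.
    static Topology buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> source = builder.stream("input-topic");
        source.mapValues(value -> value.toUpperCase())
              .to("output-topic");
        return builder.build();
    }

    public static void main(String[] args) {
        // Print the topology description to inspect the pipeline shape
        // without connecting to a broker.
        System.out.println(buildTopology().describe());
    }
}
```

In a real deployment this topology would be passed to a `KafkaStreams` instance along with properties pointing at the cluster's bootstrap servers; the sketch only shows how the processing graph is declared.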
This lab will show you how to set up a Confluent Kafka cluster on Google Kubernetes Engine and how to create streaming events.
This lab will cover how to create a clickstream data analysis pipeline using ksqlDB.
This hands-on lab provides step-by-step instructions for developers to apply the basic principles of streaming applications using the Confluent Platform.