Set Up a Streaming Data Pipeline With Apache Kafka on GCP


Hi All,

Today, we will go through a hands-on lab on GCP: setting up a streaming data pipeline with Apache Kafka on Confluent, hosted on GCP.

Overview

In this lab, we will create a streaming data pipeline with Kafka, getting a hands-on look at the Kafka Streams API. You will run a Java application that uses the Kafka Streams library, showcasing a simple end-to-end data pipeline powered by Apache Kafka.
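Before jumping in, it helps to see what such a Streams application looks like. The sketch below is modeled on Kafka's well-known WordCount example; the broker address (localhost:9092), application id, and topic names (streams-plaintext-input, streams-wordcount-output) are assumptions for illustration and may differ from what the lab actually uses.

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.Produced;

    import java.util.Arrays;
    import java.util.Properties;

    public class WordCount {
        public static void main(String[] args) {
            // Basic Streams configuration; broker address and application id are assumptions.
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-wordcount");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // Read each line of text from the input topic.
            KStream<String, String> textLines = builder.stream("streams-plaintext-input");

            // Split lines into words, group by word, and keep a running count per word.
            KTable<String, Long> wordCounts = textLines
                    .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+")))
                    .groupBy((key, word) -> word)
                    .count();

            // Write the running counts to the output topic.
            wordCounts.toStream().to("streams-wordcount-output",
                    Produced.with(Serdes.String(), Serdes.Long()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();

            // Close the Streams application cleanly when the JVM shuts down.
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }

The topology reads lines from the input topic, splits them into words, maintains a running count per word, and writes those counts to the output topic.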

Objectives

In this lab (GSP730), we will:

  • Start a Kafka cluster on a single Compute Engine machine
  • Write example input data to a Kafka topic, using the console producer included in Kafka
  • Process the input data with a Java application called WordCount that uses the Kafka Streams library
  • Inspect the output data of the application, using the console consumer included in Kafka (a programmatic alternative is sketched just after this list)
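The lab itself inspects the output with the console consumer bundled with Kafka. Purely for illustration, the same check could be done programmatically with the Kafka Java client; in this sketch the broker address, group id, and output topic name are assumptions carried over from the WordCount sketch above.

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.LongDeserializer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    public class InspectWordCounts {
        public static void main(String[] args) {
            // Consumer configuration; broker address and group id are assumptions.
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "wordcount-inspector");
            props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, LongDeserializer.class.getName());

            try (KafkaConsumer<String, Long> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("streams-wordcount-output"));
                // Poll until interrupted (Ctrl+C), printing each word and its current count.
                while (true) {
                    ConsumerRecords<String, Long> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, Long> record : records) {
                        System.out.printf("%s : %d%n", record.key(), record.value());
                    }
                }
            }
        }
    }

Because WordCount emits Long counts keyed by String words, the consumer must use matching String and Long deserializers, just as the console consumer needs the corresponding deserializer properties on the command line.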

Setup and requirements

Before you click the Start Lab button

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.
