Introduction Video


Description

In this training, you will learn all the basics you need to start working with Apache Kafka: how to set up your message queue and how to write message producers and consumers. After completing this training, you will find it easy to work with Kafka and to understand how similar tools on cloud platforms work.



Sections

Kafka & Message Queue Basics

After the basics section, you will understand what Kafka is, how it is integrated into a Data Science platform, and how it works, especially for event and stream processing. You will learn about Kafka's basic moving parts, such as topics, messages, and consumer groups, and how they are connected. You will also see what a message queue looks like, how data is produced and consumed, and why ordering and delivery guarantees matter for message queues.
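To make the moving parts concrete, here is a toy sketch of the core idea (not an implementation of Kafka itself): a topic is an append-only log, producers append to it, and each consumer group tracks its own read position (offset), so different groups consume the same messages independently and in order.

```python
class ToyTopic:
    """Toy stand-in for a Kafka topic: an append-only log with
    per-consumer-group offsets."""

    def __init__(self):
        self.log = []      # append-only message log
        self.offsets = {}  # consumer group -> next offset to read

    def produce(self, message):
        self.log.append(message)

    def consume(self, group):
        """Return the next unread message for this group, or None."""
        offset = self.offsets.get(group, 0)
        if offset >= len(self.log):
            return None               # nothing new for this group
        self.offsets[group] = offset + 1  # "commit" the offset
        return self.log[offset]


topic = ToyTopic()
topic.produce("order placed")
topic.produce("order shipped")

print(topic.consume("billing"))    # "order placed"
print(topic.consume("billing"))    # "order shipped"  (FIFO order kept)
print(topic.consume("analytics"))  # "order placed"   (own offset)
```

Note how the "analytics" group still sees the first message even after "billing" has read past it: the log is shared, the offsets are not.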



Apache Kafka Parts

We also dig deeper into specific Apache Kafka components. Here you will learn about topic partitions, how they are distributed across brokers, and how data is processed within them. I will also explain what the ZooKeeper service is and how it works with the Kafka brokers and their metadata.
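The key mechanism behind partitions can be sketched in a few lines. Kafka's default partitioner hashes the message key (with murmur2) to pick a partition; the `crc32` used below is a simplified stand-in, but the principle is the same: the same key always lands on the same partition, which is what preserves per-key ordering.

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    """Map a message key to a partition. Kafka's real default
    partitioner uses a murmur2 hash; crc32 here is only an
    illustrative stand-in with the same property: deterministic,
    so one key always goes to one partition."""
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# All events for one customer hit the same partition (and therefore
# the broker leading that partition), so they stay in order.
print(partition_for("customer-42", 3) == partition_for("customer-42", 3))  # True
```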



Development Environment

In this part, you will learn how to set up and run Kafka on Windows via Docker. I walk you through the individual steps, guide you through the setup of the Bitnami Docker container, and give you some tips for a successful environment installation.
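One quick sanity check after starting the containers is simply verifying that the broker's listener port accepts TCP connections. A minimal sketch, assuming the default host/port mapping of `localhost:9092` (adjust if you mapped a different host port in Docker):

```python
import socket

def broker_reachable(host: str = "localhost", port: int = 9092,
                     timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the broker port succeeds.
    9092 is Kafka's conventional default listener port; this only
    checks reachability, not that Kafka is fully healthy."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False right after startup, give the container a few seconds: the broker registers with ZooKeeper before it starts accepting clients.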



Working with Kafka

In this practical part, we actually work with Kafka. You will learn how to set up a topic, get to know the basic topic commands, and see how to work with them. You will also learn how to create a producer that writes messages into Kafka and how to create a consumer. I will show you how to test both of them via Python and how to check your consumer offsets by using the offset checker.
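A producer/consumer pair in Python might look like the sketch below. It assumes the `kafka-python` client (`pip install kafka-python`) and a broker on `localhost:9092`; the topic and group names are just examples. The client imports sit inside the functions so the serializers can be used without a running broker.

```python
import json

def encode_event(event: dict) -> bytes:
    """Kafka stores raw bytes; we serialize events as UTF-8 JSON."""
    return json.dumps(event).encode("utf-8")

def decode_event(raw: bytes) -> dict:
    return json.loads(raw.decode("utf-8"))

def send_events(topic, events, bootstrap="localhost:9092"):
    from kafka import KafkaProducer  # kafka-python client (assumed installed)
    producer = KafkaProducer(bootstrap_servers=bootstrap,
                             value_serializer=encode_event)
    for event in events:
        producer.send(topic, event)
    producer.flush()  # block until all messages are acknowledged

def read_events(topic, group, bootstrap="localhost:9092"):
    from kafka import KafkaConsumer
    consumer = KafkaConsumer(topic,
                             bootstrap_servers=bootstrap,
                             group_id=group,              # offsets tracked per group
                             auto_offset_reset="earliest",  # start from the beginning
                             value_deserializer=decode_event)
    for record in consumer:
        yield record.value

if __name__ == "__main__":
    send_events("demo-topic", [{"user": "anna", "action": "login"}])
    for event in read_events("demo-topic", group="demo-group"):
        print(event)
        break
```

With a group id set, the consumer's committed offsets can then be inspected from the Kafka distribution's CLI tools, e.g. `kafka-consumer-groups.sh --bootstrap-server localhost:9092 --describe --group demo-group`.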



Kafka in Data Platforms & Conclusion

At the end, we talk about how Kafka can be integrated into Data Science platforms. To illustrate this, I present three use cases: an ingestion pipeline, multiple processing jobs as consumers, and multistage stream processing. This way, you can start implementing message queues in your daily work right away.

Training Curriculum



  Apache Kafka fundamentals course
  Kafka & message queue basics
  Apache Kafka parts
  Development environment
  Working with Kafka
  Kafka in data platforms & conclusion