The Introduction to Kafka Streams training course teaches the basics of Apache Kafka, one of the most powerful and widely used distributed streaming platforms. Kafka is fault-tolerant and highly scalable, and it is used for log aggregation, stream processing, event sourcing, and as a distributed commit log. It also powers online messaging, enables in-memory computing, and supports data collection for big data, among many other uses.
The course begins with an introduction to Kafka and its components, then covers key concepts and hands-on programming with Kafka. It concludes with a look at Schema Registry and streaming.
| | |
| :-- | :-- |
| Purpose | Learn to utilize Kafka Streams and identify where Kafka can be further incorporated into practice. |
| Audience | Developers and developer teams looking to leverage Kafka Streams. |
| Role | Software Developer |
| Skill Level | Introduction |
| Style | Learning Spikes - Workshops |
| Duration | 2 Days |
| Related Technologies | Apache Kafka |
Productivity Objectives
- Describe Kafka and its uses
- Demonstrate an understanding of components such as Topics, Partitions, Brokers, Producers, Consumers, Controllers, and ZooKeeper by creating a personal Kafka development environment
- Create producers and consumers to interact with Kafka
- Use the basic features of Schema Registry
- Write Kafka Streams applications with Java
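To give a flavor of the last objective, the following is a minimal sketch of a Kafka Streams application in Java. It assumes the `kafka-streams` library is on the classpath; the topic names `input-lines` and `output-lines` are illustrative placeholders, not part of the course material.

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;

public class WordStreamSketch {
    // Builds a tiny topology: read records from an input topic,
    // uppercase each value, and forward the result to an output topic.
    static Topology buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> lines = builder.stream("input-lines");
        lines.mapValues(value -> value.toUpperCase())
             .to("output-lines");
        return builder.build();
    }

    public static void main(String[] args) {
        // Printing the topology description is a handy way to inspect
        // the processing graph without connecting to a broker.
        System.out.println(buildTopology().describe());
    }
}
```

Running the application against a live cluster would additionally require a `Properties` object with `application.id` and `bootstrap.servers`, wrapped in a `KafkaStreams` instance; the course covers that configuration in the streaming lessons.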