Kafka for Developers Online Course
This course is designed to teach developers how to efficiently serialize and exchange data between applications using Apache Kafka and AVRO. It covers data serialization concepts, Schema Registry, and Kafka Producers & Consumers while ensuring data compatibility and evolution. Through hands-on coding, you will learn how to integrate Spring Boot with Kafka and AVRO, build real-world applications, and manage schema evolution without disrupting existing consumers. By the end of this course, you will have a practical understanding of how AVRO improves data consistency, compression, and contract enforcement in Kafka-based systems.
Key Benefits
- Gain expertise in data serialization and AVRO schema design.
- Learn how to set up Kafka and Schema Registry for managing message structures.
- Understand schema evolution strategies to maintain backward and forward compatibility.
- Build Spring Boot Kafka Producers and Consumers using AVRO.
- Learn to publish and consume messages with Schema Registry for enforcing data contracts.
- Work on real-world use cases, such as a Coffee Shop Order Service.
- Develop RESTful endpoints that interact with Kafka.
- Get hands-on experience with Gradle and Maven for AVRO schema generation.
Target Audience
This course is ideal for software developers and engineers who want to enhance their knowledge of Kafka and data serialization. It is particularly beneficial for:
- Java developers working with event-driven architectures.
- Backend engineers handling message-based communication.
- Data engineers managing real-time data pipelines.
- Developers looking to implement Schema Registry for enforcing data contracts.
Prerequisites
- Prior experience with Java programming.
- Basic understanding of Kafka Producers and Consumers.
- Familiarity with command-line interfaces and Docker is helpful.
- Knowledge of Spring Boot is recommended but not mandatory.
Learning Objectives
- Understand the importance of data serialization in distributed systems.
- Learn about different serialization formats and their applications in Kafka.
- Gain expertise in AVRO schema creation and its advantages over other formats.
- Set up Kafka and Schema Registry in a local environment.
- Build Kafka Producers and Consumers that use AVRO serialization.
- Explore the internal structure of AVRO records.
- Learn schema evolution techniques for maintaining compatibility.
- Understand naming strategies in Schema Registry and their use cases.
- Develop a real-world Coffee Order Service with Spring Boot, Kafka, and AVRO.
- Implement REST APIs to publish and consume Kafka messages.
Course Outline
The Kafka for Developers course covers the following domains:
Domain 1. Introduction and Course Overview
- Introduction to the course and what you will learn.
- Understanding prerequisites required before starting.
Domain 2. Understanding Data Contracts and Serialization in Kafka
- Exploring data contracts and why they are critical in Kafka.
- Overview of various serialization techniques used in Kafka.
Domain 3. Introduction to AVRO and Its Importance
- Understanding AVRO and why it is a preferred serialization format.
- Writing a simple AVRO schema to structure data.
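As a taste of what a schema looks like, here is a minimal AVRO schema for a simple greeting record; the record name, namespace, and field are illustrative, not taken from the course materials:

```json
{
  "type": "record",
  "name": "Greeting",
  "namespace": "com.example.avro",
  "fields": [
    {"name": "greeting", "type": "string"}
  ]
}
```

Schemas like this are saved as `.avsc` files and later compiled into Java classes by the build tool.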
Domain 4. Setting Up Kafka Locally with Docker
- Installing Kafka and Zookeeper using Docker Compose.
- Sending and receiving messages using Kafka CLI tools.
- Producing and consuming messages using AVRO console tools.
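A local setup along these lines can be sketched with a Docker Compose file; the Confluent images shown are commonly used for this, but the exact image versions and port mappings are illustrative:

```yaml
# docker-compose.yml — minimal single-broker sketch (versions are illustrative)
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.3.0
    depends_on: [zookeeper]
    ports: ["9092:9092"]
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

With the containers up, the Kafka CLI and AVRO console tools can produce and consume test messages against `localhost:9092`.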
Domain 5. Greeting App: Setting Up the AVRO Project Using Gradle
- Creating a base project for the Greeting App.
- Generating Java classes from AVRO schema using Gradle.
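Code generation with Gradle typically relies on a community AVRO plugin; the snippet below is a sketch assuming the widely used `com.github.davidmc24.gradle.plugin.avro` plugin, with illustrative version numbers:

```groovy
// build.gradle — minimal sketch; versions are illustrative
plugins {
    id "java"
    id "com.github.davidmc24.gradle.plugin.avro" version "1.9.1"
}

repositories {
    mavenCentral()
}

dependencies {
    implementation "org.apache.avro:avro:1.11.1"
}

// By default the plugin picks up *.avsc files from src/main/avro
// and generates the corresponding Java classes during the build.
```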
Domain 6. Greeting App: Setting Up the AVRO Project Using Maven
- Setting up the base project using Maven.
- Generating Java classes from AVRO schema using Maven.
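The Maven equivalent uses the official `avro-maven-plugin` bound to the `generate-sources` phase; the version and directory paths below are illustrative defaults:

```xml
<!-- pom.xml fragment — version and paths are illustrative -->
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.11.1</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals><goal>schema</goal></goals>
      <configuration>
        <sourceDirectory>${project.basedir}/src/main/avro</sourceDirectory>
        <outputDirectory>${project.basedir}/src/main/java</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```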
Domain 7. Building Kafka Producer and Consumer with AVRO
- Developing a Kafka Producer that sends AVRO messages.
- Creating a Kafka Consumer to process AVRO messages.
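The key difference from a plain Kafka client is the serializer configuration. A sketch of the producer and consumer properties, assuming the Confluent AVRO serializers and a local Schema Registry (the group id is illustrative):

```properties
# Producer side
bootstrap.servers=localhost:9092
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
schema.registry.url=http://localhost:8081

# Consumer side
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
specific.avro.reader=true
group.id=greeting-consumer
```

Setting `specific.avro.reader=true` makes the consumer deserialize into the generated Java classes rather than generic AVRO records.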
Domain 8. Real-World Use Case: Coffee Shop Order Service with AVRO
- Overview of the Coffee Shop Order Service architecture.
- Setting up the project using Gradle and Maven.
- Designing a schema for coffee orders using AVRO.
- Generating Java classes from AVRO schema with Gradle and Maven.
- Developing a Kafka Producer to publish coffee orders.
- Developing a Kafka Consumer to process coffee orders.
Domain 9. Working with Logical Schema Types in AVRO
- Introduction to logical types in AVRO.
- Adding timestamp and decimal types to the Coffee Order schema.
- Using UUID as a unique identifier for coffee orders.
- Implementing the date logical type in AVRO.
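The logical types above annotate a base AVRO type with extra meaning. A sketch of how they might appear in a coffee order schema (field names are illustrative):

```json
{
  "type": "record",
  "name": "CoffeeOrder",
  "namespace": "com.example.avro",
  "fields": [
    {"name": "id", "type": {"type": "string", "logicalType": "uuid"}},
    {"name": "orderedTime", "type": {"type": "long", "logicalType": "timestamp-millis"}},
    {"name": "amount",
     "type": {"type": "bytes", "logicalType": "decimal", "precision": 10, "scale": 2}},
    {"name": "pickupDate", "type": {"type": "int", "logicalType": "date"}}
  ]
}
```

On the wire these are still plain `string`, `long`, `bytes`, and `int` values; the logical type tells the generated code how to interpret them (e.g. as `java.util.UUID` or `java.time.Instant`).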
Domain 10. Exploring AVRO Records – Internal Structure
- A deep dive into what makes up an AVRO record.
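One detail worth knowing about the internal structure: AVRO stores `int` and `long` fields as zigzag-encoded variable-length integers, which is why small values take only a byte or two. A minimal sketch of that encoding (this reimplements the idea for illustration; real applications use the AVRO library):

```java
// Sketch of the wire-level integer encoding AVRO uses inside a record:
// values are zigzag-encoded, then written as little-endian base-128 varints.
public class AvroVarInt {

    // ZigZag maps signed longs to unsigned so small magnitudes stay small:
    // 0 -> 0, -1 -> 1, 1 -> 2, -2 -> 3, 2 -> 4, ...
    public static long zigZag(long n) {
        return (n << 1) ^ (n >> 63);
    }

    // Variable-length encoding: 7 bits per byte, high bit = "more bytes follow".
    public static byte[] encodeLong(long n) {
        long v = zigZag(n);
        java.io.ByteArrayOutputStream out = new java.io.ByteArrayOutputStream();
        while ((v & ~0x7FL) != 0) {
            out.write((int) ((v & 0x7F) | 0x80));
            v >>>= 7;
        }
        out.write((int) v);
        return out.toByteArray();
    }

    public static void main(String[] args) {
        System.out.println(zigZag(-1));            // prints 1: small negative stays small
        System.out.println(encodeLong(1).length);  // prints 1: fits in a single byte
    }
}
```

Because records carry no field names or tags on the wire, the reader needs the writer's schema to decode these bytes, which is exactly the gap Schema Registry fills later in the course.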
Domain 11. Schema Evolution in AVRO – Understanding Issues Without Schema Registry
- Exploring the challenges of schema changes when Schema Registry is not used.
- Understanding how schema evolution affects Kafka Consumers.
Domain 12. Getting Started with Schema Registry
- What is Schema Registry, and why is it important?
- Publishing and consuming records using Schema Registry.
- Understanding Schema Registry internals and interacting through REST APIs.
- Publishing and consuming keys as AVRO records.
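The REST interaction mentioned above can be explored with plain HTTP calls; this sketch assumes a Schema Registry on its default local port, and the subject name is illustrative (the default naming strategy derives it from the topic name plus `-value`):

```shell
# Assumes Schema Registry is running locally on port 8081
curl http://localhost:8081/subjects                                 # list registered subjects
curl http://localhost:8081/subjects/greetings-value/versions        # list versions for a subject
curl http://localhost:8081/subjects/greetings-value/versions/latest # fetch the latest schema
```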
Domain 13. Schema Evolution with Schema Registry
- Understanding schema evolution and its impact on applications.
- Updating Gradle and Maven to work with Maven Local Repository.
- Managing Backward Compatibility: Removing a field from the schema.
- Managing Forward Compatibility: Adding a field to the schema.
- Maintaining Full Compatibility: Handling optional fields.
- Understanding No Compatibility: Modifying field names.
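The compatibility modes above mostly hinge on default values. As a sketch, adding an optional field with a default keeps a schema both backward and forward compatible, because old readers can ignore it and new readers can fill it in (field names are illustrative):

```json
{
  "type": "record",
  "name": "CoffeeOrder",
  "namespace": "com.example.avro",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "size", "type": ["null", "string"], "default": null}
  ]
}
```

Renaming a field, by contrast, breaks both directions at once, which is why it falls under "no compatibility."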
Domain 14. Understanding Schema Naming Strategies
- Learning different naming strategies in Schema Registry.
- Creating an AVRO schema for coffee order updates.
- Producing and consuming update events using RecordNameStrategy.
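Switching strategies is a producer-side configuration change. A sketch, assuming the Confluent serializers:

```properties
# Default is TopicNameStrategy (subject = "<topic>-value").
# RecordNameStrategy derives the subject from the record's fully qualified
# name instead, which allows multiple event types on the same topic.
value.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy
```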
Domain 15. Developing a Complete Coffee Order Service with Spring Boot and Schema Registry
- Overview of the Spring Boot-based Coffee Order Service.
- Setting up the base project using Gradle and Maven.
- Designing DTOs (Data Transfer Objects) for the Coffee Order Service.
- Creating a POST endpoint to accept coffee orders.
- Implementing a service layer to convert DTOs into AVRO objects.
- Configuring Kafka Producer properties for the service.
- Developing a Kafka Producer to send coffee order events.
- Developing a Kafka Consumer to process coffee order messages.
- Implementing a PUT endpoint to update existing coffee orders.
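In a Spring Boot project, most of the Kafka wiring for a service like this lives in configuration. A sketch of an `application.yml`, assuming Spring for Apache Kafka with the Confluent AVRO serializers (the group id and URLs are illustrative):

```yaml
# application.yml — minimal sketch; values are illustrative
spring:
  kafka:
    bootstrap-servers: localhost:9092
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
    consumer:
      group-id: coffee-order-service
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
    properties:
      schema.registry.url: http://localhost:8081
      specific.avro.reader: true
```

With this in place, the service layer can hand generated AVRO objects to a `KafkaTemplate` and receive them back in `@KafkaListener` methods.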