Booking options
£74.99
On-Demand course
5 hours 32 minutes
All levels
This course is a mix of theory and coding that gives you hands-on experience in building Kafka applications using AVRO and Schema Registry. You will code and build a coffee order service using Spring Boot and Schema Registry. Anyone interested in learning about Schema Registry and in building Kafka Producer and Consumer applications that interact with it can take this course.
The course begins with an introduction that gives you an overview of what to expect from it. We will cover how serialization relates to Kafka and the benefits it brings to the overall Kafka architecture. You will gain an understanding of the different serialization formats and the support for schemas in AVRO, Protobuf, and Thrift, and you will be introduced to AVRO and why it is popular for working with Kafka and Schema Registry. Further into the course, we will set up Kafka locally and produce and consume messages using the Kafka Console Producer and Consumer. You will set up the base project for the greeting app and use it to generate Java classes from the greetings schema, first with the Gradle build tool and then with Maven. You will also learn the different techniques for evolving a schema as business requirements change. In later sections, you will code and build a Spring Boot Kafka application that exchanges data in AVRO format and interacts with Schema Registry for data evolution. You will also build a RESTful service that receives events through the REST interface and publishes them to Kafka. By the end of this course, you will have a complete understanding of how to use AVRO as a data serialization format and how Schema Registry supports the evolution of your data. All resources and code files are placed here: https://github.com/PacktPublishing/Kafka-for-Developers---Data-Contracts-Using-Schema-Registry
Understand the fundamentals of data serialization
Understand the different serialization formats available
Consume AVRO records using Kafka Consumer
Publish AVRO records using Kafka Producer
Enforce data contracts using Schema Registry
Use Schema Registry to register the AVRO Schema
This course is suitable for experienced Java developers and for developers interested in learning AVRO and how to exchange data between applications using AVRO and Kafka.
It is also suited to developers who want to learn about Schema Registry, how it fits into Kafka, and techniques for evolving data. A prior understanding of Java and experience building Kafka Producers are a must for this course.
This is a hands-on course in which you learn the concepts through code. It is structured to give you both theoretical and coding experience of building Kafka applications using AVRO and Schema Registry.
Introduction to AVRO and the advantages of using it for sharing messages between applications * Learn how Kafka Producer and Consumer interact with the Schema Registry * Build Spring Boot Kafka Producer and Consumer applications that use AVRO as the serialization format
https://github.com/PacktPublishing/Kafka-for-Developers---Data-Contracts-Using-Schema-Registry
Dilip Sundarraj is a software engineer who has been building software since 2008. He is passionate about learning modern technologies and staying up to date with the latest tools, frameworks, and more. He loves to share his knowledge with the world, which is one of the key reasons he is in the online teaching industry.
He loves interacting with other software developers and believes that this helps him share knowledge and learn from them. In his leisure time, he loves to play cricket, watch movies, and work out in the gym to maintain a balance between physical and mental strength.
Dilip has a YouTube channel named Code with Dilip, where he shares technical content related to languages, frameworks, best practices, and more.
1. Getting Started with the Course
1. Introduction In this video, you will be introduced to the course and what to expect from it.
2. Prerequisites In this video, you will cover the prerequisites that are needed for this course.
2. Data Contract and Serialization in Kafka
1. Data Contract and Serialization in Kafka In this video, we will investigate how serialization is connected to Kafka and how it benefits the overall Kafka architecture.
2. Serialization Formats In this video, we will investigate different serialization formats and the support for Schema in AVRO, Protobuf, and Thrift.
3. Introduction to AVRO - A Data Serialization System
1. Introduction to AVRO - What Is AVRO and Why AVRO? In this video, you will learn about AVRO and why AVRO is one of the popular serialization formats.
2. Build a Simple AVRO Schema In this video, you will learn to build a simple AVRO schema (a minimal schema sketch follows this section).
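For reference, here is a minimal sketch of what a greeting-style AVRO schema might look like, parsed and validated with the standard Avro Java library. The record name, namespace, and field are illustrative assumptions, not necessarily the exact schema built in the course.

```java
import org.apache.avro.Schema;

public class GreetingSchemaDemo {
    // Hypothetical greeting schema, similar in spirit to the one built in this section
    private static final String GREETING_SCHEMA = """
            {
              "type": "record",
              "name": "Greeting",
              "namespace": "com.example.greeting",
              "fields": [
                {"name": "greeting", "type": "string"}
              ]
            }""";

    public static void main(String[] args) {
        // Parse and validate the schema definition at runtime
        Schema schema = new Schema.Parser().parse(GREETING_SCHEMA);
        System.out.println("Parsed schema full name: " + schema.getFullName());
    }
}
```

Running this class simply confirms that the schema definition is well formed and prints its fully qualified name.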
4. Kafka Setup and Demo in Local Using Docker
1. Set Up Kafka Broker and Zookeeper Using Docker Compose In this video, you will learn to set up Kafka Broker and Zookeeper using Docker Compose.
2. Produce and Consume Messages Using CLI In this video, you will learn to produce and consume messages using the Kafka Console Producer and Consumer.
3. Produce and Consume Using AVRO Console Producer and Consumer In this video, you will learn to produce and consume AVRO messages using AVRO Console Producer and Consumer.
5. Greeting App - Base AVRO Project Setup - Gradle
1. Base Project Setup for Greeting App In this video, we will set up the base project for the course using the Gradle build tool.
2. Generate AVRO Java Records Using AVRO Schema Files In this video, you will learn how to generate Java AVRO records using AVRO Schema.
6. Greeting App - Base AVRO Project Setup - Maven
1. Base Project Setup for Greeting App - Maven In this video, we will set up the base project for the course using the Maven build tool.
2. Generate AVRO Java Records Using AVRO Schema Files - Maven In this video, you will learn how to generate Java AVRO records using AVRO Schema with Maven.
7. Build AVRO Producer and Consumer in Java
1. Let's Build AVRO Kafka Producer In this video, you will learn to build a Kafka Producer to publish AVRO records to the Kafka topic (a minimal producer sketch follows this section).
2. Let's Build AVRO Kafka Consumer In this video, you will learn to build a Kafka Consumer to consume AVRO records from the Kafka topic.
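As a companion to this section, here is a minimal, hedged sketch of an AVRO Kafka producer using Confluent's KafkaAvroSerializer with a GenericRecord. The broker address, Schema Registry URL, topic name, and schema are assumptions for illustration; the course builds its producer around its own schema and generated classes.

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class GreetingAvroProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // assumed local broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");          // Confluent AVRO serializer
        props.put("schema.registry.url", "http://localhost:8081");              // assumed local Schema Registry

        // Hypothetical greeting schema used only for this sketch
        Schema schema = new Schema.Parser().parse("""
                {"type": "record", "name": "Greeting", "namespace": "com.example.greeting",
                 "fields": [{"name": "greeting", "type": "string"}]}""");

        GenericRecord greeting = new GenericData.Record(schema);
        greeting.put("greeting", "Hello, AVRO!");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // Publish the AVRO record to an assumed topic name
            producer.send(new ProducerRecord<>("greeting-topic", "greeting-key", greeting));
        }
    }
}
```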
8. Coffee Shop Order Service Using AVRO - A Real-Time Use Case
1. Application Overview In this video, you will get an overview of the application that we are going to build in this section.
2. Project Setup for Coffee Shop - Gradle In this video, we will set up the base project for the coffee shop order service using Gradle.
3. Project Setup for Coffee Shop - Maven In this video, we will set up the base project for the coffee shop order service using Maven.
4. Build a Coffee Order Schema Using AVRO In this video, we will create the AVRO Schema for the coffee order service.
5. Generating AVRO Classes Using Gradle In this video, we will code and generate AVRO classes using Gradle.
6. Generating AVRO Classes Using Maven In this video, we will code and generate AVRO classes using Maven.
7. Build a Coffee Shop Order Producer In this video, we will code and produce the coffee order AVRO record to a Kafka topic.
8. Build a Coffee Shop Order Consumer In this video, we will code and consume the coffee order AVRO record from the Kafka topic (a minimal consumer sketch follows this section).
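Similarly, a minimal consumer sketch using Confluent's KafkaAvroDeserializer might look like the following. The broker address, Schema Registry URL, group id, and topic name are assumptions; without specific.avro.reader enabled, records come back as GenericRecord instances.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class CoffeeOrderAvroConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // assumed local broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "coffee-order-consumer");      // assumed consumer group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");         // Confluent AVRO deserializer
        props.put("schema.registry.url", "http://localhost:8081");               // assumed local Schema Registry
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("coffee-orders"));                         // assumed topic name
            while (true) {
                ConsumerRecords<String, GenericRecord> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, GenericRecord> record : records) {
                    // Print each consumed AVRO record
                    System.out.printf("key=%s, value=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```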
9. Logical Schema Types in AVRO
1. Introduction to Logical Types in AVRO In this video, we will code and learn about the logical types in AVRO and how to use them in your project.
2. Add a Timestamp, Decimal Logical Type to the CoffeeOrder Schema In this video, you will learn to add a timestamp and decimal field to the CoffeeOrder Schema (a schema sketch with logical types follows this section).
3. Adding the UUID as Key for CoffeeOrder In this video, we will code and learn to use a UUID as the record key in our CoffeeOrder Producer.
4. Date Logical Type In this video, we will code and learn about the date logical type.
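To illustrate the logical types covered in this section, the sketch below parses a hypothetical CoffeeOrder schema that uses the uuid, timestamp-millis, and decimal logical types. The field names, precision, and scale are illustrative assumptions, not the course's exact schema.

```java
import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;

public class CoffeeOrderLogicalTypesDemo {
    // Hypothetical CoffeeOrder schema showing logical types layered on primitive types
    private static final String COFFEE_ORDER_SCHEMA = """
            {
              "type": "record",
              "name": "CoffeeOrder",
              "namespace": "com.example.coffeeshop",
              "fields": [
                {"name": "id", "type": {"type": "string", "logicalType": "uuid"}},
                {"name": "orderedTime",
                 "type": {"type": "long", "logicalType": "timestamp-millis"}},
                {"name": "total",
                 "type": {"type": "bytes", "logicalType": "decimal", "precision": 6, "scale": 2}}
              ]
            }""";

    public static void main(String[] args) {
        Schema schema = new Schema.Parser().parse(COFFEE_ORDER_SCHEMA);
        // The logical type is attached to the underlying primitive type of each field
        System.out.println(LogicalTypes.fromSchema(schema.getField("orderedTime").schema()).getName());
        System.out.println(LogicalTypes.fromSchema(schema.getField("total").schema()).getName());
    }
}
```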
10. AVRO Record - Under the Hood
1. What's Inside an AVRO Record? In this video, you will learn about what's inside an AVRO record.
11. Schema Changes in AVRO - Issues without Schema Registry
1. Evolving the Schema - Consumer Fails to Read the New Schema In this video, we will add a new field to the existing CoffeeOrder Schema and understand the behavior of the consumer app.
12. Introduction to Schema Registry
1. Introduction to Schema Registry In this video, you will be introduced to Schema Registry and how the Producer and Consumer interact with it.
2. Publish and Consume a Record Using Schema Registry In this video, we will code and learn how the Producer and Consumer interact with Schema Registry.
3. Schema Registry Internals and Interacting with Schema Registry Using REST Endpoint In this video, you will learn to interact with Schema Registry using the REST Client tool, Insomnia (a small client-based lookup sketch follows this section).
4. Publish and Consume "Key" as an AVRO Record In this video, you will learn to produce keys that are also AVRO records.
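Alongside the REST-based exploration in this section, Schema Registry can also be queried programmatically. The sketch below uses Confluent's CachedSchemaRegistryClient to fetch the latest registered schema for a subject; the registry URL, cache size, and subject name are assumptions based on the default TopicNameStrategy.

```java
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaMetadata;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

public class SchemaRegistryLookupDemo {
    public static void main(String[] args) throws Exception {
        // Assumed local Schema Registry URL; the cache size (100) is arbitrary
        SchemaRegistryClient client = new CachedSchemaRegistryClient("http://localhost:8081", 100);

        // With the default TopicNameStrategy, the value subject is "<topic>-value"
        SchemaMetadata metadata = client.getLatestSchemaMetadata("coffee-orders-value");

        System.out.println("Schema id:      " + metadata.getId());
        System.out.println("Schema version: " + metadata.getVersion());
        System.out.println("Schema:         " + metadata.getSchema());
    }
}
```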
13. Data Evolution Using Schema Registry
1. Data Evolution and Schema Evolution In this video, we will explain the data lifecycle and schema evolution in Schema Registry.
2. Update the Code to Interact with Maven Local Repository - Gradle In this video, we will code and learn how to integrate our code with the Maven local repository.
3. Update the Code to Interact with Maven Local Repository - Maven In this video, we will code and learn how to integrate our code with the Maven local repository.
4. Deleting a Field in Schema - BACKWARD Compatibility In this video, we will code and learn about BACKWARD compatibility in Schema Registry.
5. Adding a New Field in Schema - FORWARD Compatibility In this video, we will code and learn about FORWARD compatibility in Schema Registry.
6. Add/Delete Optional Fields - FULL Compatibility In this video, we will code and learn about FULL compatibility in Schema Registry.
7. Modify Field Names - NONE Compatibility In this video, we will code and learn about NONE compatibility in Schema Registry (a short compatibility-configuration sketch follows this section).
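As a rough illustration of how compatibility levels can be managed, the sketch below sets and reads the compatibility level for a single subject through Confluent's Java client. The registry URL, subject name, and chosen level are assumptions; the same effect can be achieved through the Schema Registry REST API.

```java
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

public class CompatibilityConfigDemo {
    public static void main(String[] args) throws Exception {
        // Assumed local Schema Registry URL; the cache size (100) is arbitrary
        SchemaRegistryClient client = new CachedSchemaRegistryClient("http://localhost:8081", 100);

        // Set the compatibility level for one subject (BACKWARD, FORWARD, FULL, or NONE);
        // this mirrors a PUT to /config/<subject> on the Schema Registry REST API
        client.updateCompatibility("coffee-orders-value", "FORWARD");

        // Read the compatibility level back for that subject
        System.out.println(client.getCompatibility("coffee-orders-value"));
    }
}
```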
14. Schema Naming Strategies
1. Different Types of Naming Strategies In this video, we will explore the different naming strategies that are available in Schema Registry.
2. Coffee Update Event AVRO Schema In this video, we will code and implement the AVRO Schema for the CoffeeOrder update event.
3. Publish and Consume CoffeeOrder UpdateEvent Using RecordNameStrategy In this video, we will code and implement the CoffeeUpdate event functionality in our Kafka Producer (a RecordNameStrategy configuration sketch follows this section).
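The sketch below shows producer properties that switch the value subject naming from the default TopicNameStrategy to RecordNameStrategy, the strategy used for the CoffeeOrder update event in this section. The broker address, Schema Registry URL, and serializer choices are assumptions for illustration.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;

public class RecordNameStrategyConfigDemo {
    public static Properties producerProps() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // assumed local broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");               // assumed local Schema Registry

        // RecordNameStrategy registers subjects by the fully qualified record name instead of
        // "<topic>-value", so CoffeeOrder and CoffeeUpdateEvent records can share one topic
        props.put("value.subject.name.strategy",
                "io.confluent.kafka.serializers.subject.RecordNameStrategy");
        return props;
    }
}
```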
15. Build a Coffee Order Service Using Spring Boot and Schema Registry
1. Overview of the App In this video, you will get an overview of the Spring Boot Kafka application that we are going to build in this section.
2. Setting Up the Base Project - Gradle In this video, we will set up the base Spring Boot project using Gradle.
3. Setting Up the Base Project - Maven In this video, we will set up the base Spring Boot project using Maven.
4. Build the DTOs for CoffeeOrderService In this video, we will build the DTOs for the coffee order service.
5. Build the POST Endpoint for the CoffeeOrderService - /coffee_orders In this video, we will build the POST endpoint in the controller, using which we can post new coffee orders (a minimal controller sketch follows this section).
6. Build the Service Layer to Map the DTO to AVRO Domain Object In this video, we will build the service layer for the coffee-orders-service, which acts as a transformation layer from DTOs to AVRO records.
7. Configure the Kafka Producer Properties in Coffee Order Service In this video, we will configure the Kafka producer properties in the coffee order service.
8. Build Kafka Producer to Publish the CoffeeOrder Events In this video, we will build the producer that publishes the AVRO records to the Kafka topic.
9. Build the Coffee Order Consumer In this video, we will build the consumer that consumes the AVRO records from the Kafka topic.
10. Build the PUT Endpoint for the CoffeeOrderService - PUT /coffee_orders/{id} In this video, we will build the PUT endpoint in the controller, using which we can update an existing coffee order.
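To tie the section together, here is a minimal, hedged sketch of the POST /coffee_orders flow: a controller accepts a DTO, maps it to an AVRO record, and publishes it with KafkaTemplate. The DTO shape, schema, and topic name are placeholders; in the course the mapping lives in a dedicated service layer and uses generated AVRO classes rather than GenericRecord.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CoffeeOrdersController {

    // Hypothetical schema; the course generates AVRO classes from its own schema instead
    private static final Schema COFFEE_ORDER_SCHEMA = new Schema.Parser().parse("""
            {"type": "record", "name": "CoffeeOrder", "namespace": "com.example.coffeeshop",
             "fields": [{"name": "id", "type": "string"}, {"name": "name", "type": "string"}]}""");

    private final KafkaTemplate<String, GenericRecord> kafkaTemplate;

    public CoffeeOrdersController(KafkaTemplate<String, GenericRecord> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping("/coffee_orders")
    public ResponseEntity<CoffeeOrderDto> placeOrder(@RequestBody CoffeeOrderDto dto) {
        // Map the incoming DTO to an AVRO record (the "service layer" step in this section)
        GenericRecord avroOrder = new GenericData.Record(COFFEE_ORDER_SCHEMA);
        avroOrder.put("id", dto.id());
        avroOrder.put("name", dto.name());

        // Publish to an assumed topic; KafkaAvroSerializer and schema.registry.url are
        // expected to be configured via spring.kafka.producer.* properties
        kafkaTemplate.send("coffee-orders", dto.id(), avroOrder);
        return ResponseEntity.status(HttpStatus.CREATED).body(dto);
    }

    // Hypothetical DTO; the course builds richer DTOs in this section
    public record CoffeeOrderDto(String id, String name) {}
}
```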