Learn core Apache Kafka features along with creating Java, Node.js and Python producers and consumers
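To give a flavour of what producing messages looks like in Java, here is a minimal producer sketch. The broker address (localhost:9092) and the topic name demo-topic are placeholder assumptions, not details taken from the course.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; adjust for your own cluster.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (int i = 0; i < 10; i++) {
                // Records with the same key are routed to the same partition.
                producer.send(new ProducerRecord<>("demo-topic", "key-" + i, "hello kafka " + i));
            }
            producer.flush();
        }
    }
}
```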
Explore the Apache Kafka ecosystem and architecture, and learn client API programming in Java
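On the consuming side of the client API, a minimal poll-loop sketch might look like the following; again, the broker address, topic name, and consumer group are placeholder assumptions.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class SimpleConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("demo-topic"));
            // Poll indefinitely; stop the process to exit.
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d key=%s value=%s%n",
                            record.partition(), record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```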
A beginner's guide to mastering real-time stream processing using Apache Kafka and Kafka Streams API
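For a sense of what the Kafka Streams API looks like, below is a minimal stateless topology sketch (a filter followed by mapValues). The application id, broker address, and topic names are placeholder assumptions rather than course material.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class FirstStreamsApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "first-streams-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> input = builder.stream("input-topic");
        // Drop empty values, uppercase the rest, and write to an output topic.
        input.filter((key, value) -> value != null && !value.isEmpty())
             .mapValues(value -> value.toUpperCase())
             .to("output-topic", Produced.with(Serdes.String(), Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```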
Through this course, you will learn how to correctly set up Kafka producers and consumers, Kafka Streams, and Kafka Connect connectors. You will also gain the skills needed to integrate Kafka with existing application platforms and to prepare for the Apache Kafka certification exam.
Duration: 2 Days | 12 CPD hours
This course is intended for: This introductory-and-beyond level course is geared for experienced Java developers seeking to become proficient in Apache Kafka. Attendees should be experienced developers who are comfortable with Java and have reasonable experience working with databases.
Overview: Working in a hands-on learning environment, students will explore: an overview of streaming technologies; Kafka concepts and architecture; programming with the Kafka API; Kafka Streams; monitoring Kafka; tuning / troubleshooting Kafka. Apache Kafka is a real-time data pipeline processor. Its high scalability, fault tolerance, execution speed, and fluid integrations are some of the key hallmarks that make it an integral part of many enterprise data architectures. In this lab-intensive two-day course, students will learn how to use Kafka to build streaming solutions.
Introduction to Streaming Systems: Fast data; Streaming architecture; Lambda architecture; Message queues; Streaming processors
Introduction to Kafka: Architecture; Comparing Kafka with other queue systems (JMS / MQ); Kafka concepts: messages, topics, partitions, brokers, producers, commit logs; Kafka & Zookeeper; Producing messages; Consuming messages (consumers, consumer groups); Message retention; Scaling Kafka
Programming with Kafka: Configuration parameters; Producer API (sending messages to Kafka); Consumer API (consuming messages from Kafka); Commits, offsets, seeking; Schema with Avro (see the consumer sketch after this outline)
Kafka Streams: Streams overview and architecture; Streams use cases and comparison with other platforms; Kafka Streams concepts (KStream, KTable, KStore); KStream operations (transformations, filters, joins, aggregations)
Administering Kafka: Hardware / software requirements; Deploying Kafka; Configuration of brokers / topics / partitions / producers / consumers; Security: how to secure a Kafka cluster and client communications (SASL, Kerberos); Monitoring: monitoring tools; Capacity planning: estimating usage and demand; Troubleshooting: failure scenarios and recovery
Monitoring and Instrumenting Kafka: Monitoring Kafka; Instrumenting with the Metrics library; Instrumenting Kafka applications and monitoring their performance
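As a taste of the Consumer API topics in the outline above (commits, offsets, seeking), here is a minimal sketch of a consumer that disables auto-commit and commits offsets manually after processing each batch. The broker address, topic, and group id are placeholders, not material from the course.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ManualCommitConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "manual-commit-group");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // Disable auto-commit so offsets are committed only after records are processed.
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("demo-topic"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset %d: %s%n", record.offset(), record.value());
                }
                if (!records.isEmpty()) {
                    // Synchronously commit the offsets of the batch just processed.
                    consumer.commitSync();
                }
            }
        }
    }
}
```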
A beginner-level course that follows a step-by-step approach to learning the fundamentals and core concepts of Apache Kafka 3.0. You will work through interesting activities such as programming a Twitter producer and Elasticsearch consumer to understand the various concepts.
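A rough sketch of what an Elasticsearch consumer of this kind might look like, assuming tweets arrive as JSON strings on a placeholder tweets topic and using the (since-deprecated) Elasticsearch 7.x high-level REST client; this is an illustration under those assumptions, not the course's actual code.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.http.HttpHost;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.common.xcontent.XContentType;

public class TweetsToElasticsearch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "elasticsearch-demo");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             RestHighLevelClient es = new RestHighLevelClient(
                     RestClient.builder(new HttpHost("localhost", 9200, "http")))) {
            consumer.subscribe(Collections.singletonList("tweets"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Index each tweet (assumed to be a JSON string) into a "tweets" index.
                    IndexRequest request = new IndexRequest("tweets")
                            .source(record.value(), XContentType.JSON);
                    es.index(request, RequestOptions.DEFAULT);
                }
            }
        }
    }
}
```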
Get hands-on with Kafka monitoring setup using Prometheus and Grafana, Kafka operations, and Kafka cluster upgrades, with the cluster set up in AWS.
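As a small illustration of the kind of metrics such a monitoring setup exposes, the sketch below reads a Kafka client's built-in metrics programmatically; a Prometheus/Grafana setup would typically scrape the same metrics over JMX (for example via the Prometheus JMX exporter) rather than print them. The broker address and topic are placeholders.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ClientMetricsDump {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("demo-topic", "key", "value"));
            producer.flush();

            // Every Kafka client exposes internal metrics (also available over JMX).
            producer.metrics().forEach((name, metric) ->
                    System.out.printf("%s / %s = %s%n",
                            name.group(), name.name(), metric.metricValue()));
        }
    }
}
```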
Learn the fundamentals and advanced concepts of Apache Kafka in this course. This course will give you a good understanding of all the concepts through hands-on practice.
This course brings together all the important topics related to modern distributed applications and systems in one place. Explore the common challenges that appear while designing and implementing large-scale distributed systems, and how big-tech companies solve those problems. Throughout the course, we are going to build a distributed URL shortening service.
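One common building block of a URL shortening service, shown purely as an illustration and not necessarily the design used in the course, is base-62 encoding of a unique numeric ID (for example from a distributed ID generator) into a short code.

```java
public class Base62ShortCode {
    private static final String ALPHABET =
            "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ";

    // Encode a unique numeric ID into a compact base-62 short code.
    static String encode(long id) {
        if (id == 0) {
            return "0";
        }
        StringBuilder sb = new StringBuilder();
        while (id > 0) {
            sb.append(ALPHABET.charAt((int) (id % 62)));
            id /= 62;
        }
        return sb.reverse().toString();
    }

    public static void main(String[] args) {
        System.out.println(encode(123456789L)); // prints "8m0Kx"
    }
}
```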
In this course, you will learn to create Kafka Streams microservices using the Spring Cloud framework. This is an example-driven course, and you will use the Confluent Kafka distribution for all the examples. By the end of this course, you will be able to create Kafka Streams microservices using different types of serialization and the Confluent Schema Registry, and to build stateless and stateful event-processing applications.
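As a hedged sketch of the functional style this kind of course covers, the example below defines a Kafka Streams processor as a java.util.function.Function bean, which the Spring Cloud Stream Kafka Streams binder can bind to input and output topics configured under spring.cloud.stream.bindings (for a bean named process, the bindings are process-in-0 and process-out-0). The class name and processing logic are placeholders, not the course's own code.

```java
import java.util.function.Function;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class StreamsMicroserviceApplication {

    public static void main(String[] args) {
        SpringApplication.run(StreamsMicroserviceApplication.class, args);
    }

    // Functional-style Kafka Streams processor; the Kafka Streams binder wires it
    // to the topics configured as the process-in-0 / process-out-0 destinations.
    @Bean
    public Function<KStream<String, String>, KStream<String, String>> process() {
        return input -> input
                .filter((key, value) -> value != null && !value.isEmpty())
                .mapValues(value -> value.toUpperCase());
    }
}
```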