Apache Kafka Fundamentals LiveLessons

English | MP4 | AVC 1280×720 | AAC 44KHz 2ch | 3h 49m | 664 MB

Almost 4 hours of video instruction to help you learn the key concepts and features behind Kafka and how it can be used to solve many problems related to the transfer of data.

Apache Kafka Fundamentals LiveLessons provides a complete overview of Kafka and Kafka-related topics. The course begins with a general overview of Kafka and then dives into use cases and the design of Kafka. Learners are introduced to the concept of clusters as well as how to create producers and consumers. Additional topics, such as security, connectors, and streams, are covered. The course finishes with a discussion of how Kafka can be used with Big Data software.

With detailed, hands-on code examples, written so they can be translated into the coding language of your choosing, learners are walked through multiple real-world scenarios, including projects from Twitter, Netflix, and Yahoo!

Topics include

  • Kafka concepts
  • Use cases
  • Kafka design
  • API overview
  • Installation and configuration
  • Clusters
  • Writing producers
  • Writing consumers
  • Kafka operations
  • Connectors
  • Streams

What You Will Learn

  • A complete overview of Kafka
  • How to install and configure Kafka
  • How Kafka fits into the big picture of Big Data

Lesson 1, “Kafka Concepts”: You learn the essentials of what Kafka is, its history, and some of the key concepts behind the Kafka solution. We compare Kafka to other potential solutions and point out the major advantages of using it.

Lesson 2, “Use Cases”: In this lesson you see how Kafka is used in real-world scenarios using projects from Twitter, Netflix, and Yahoo!

Lesson 3, “Kafka Design”: In this lesson you learn the design principles of Kafka. You start by learning key features and concepts, including the Kafka APIs, topics, logs, producers, and consumers. You also learn the different ways in which Kafka can be used, including as a messaging system, as a storage system, and for stream processing.

Lesson 4, “API Overview”: This lesson explores some basics of the producer and consumer APIs. The goal of this lesson is to start exploring the Java libraries that are used while creating Kafka applications.
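
If you want to follow along in Java, the producer and consumer APIs explored in this lesson live in the `kafka-clients` library, which a build tool such as Maven can pull in. A minimal sketch of the dependency declaration follows; the version number is illustrative, so match it to the broker version you are running:

```xml
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-clients</artifactId>
  <!-- illustrative version; align with your broker -->
  <version>3.7.0</version>
</dependency>
```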

Lesson 5, “Installation and Configuration”: As its name suggests, this lesson covers how to install and configure Kafka. It starts by discussing hardware and operating system considerations. It then covers the installation of ZooKeeper, a separate software tool that Kafka needs in order to run correctly. Finally, it covers how to install Kafka and perform some basic configuration operations.
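
As a preview of the installation flow, the sketch below shows the standard startup sequence using the scripts and sample configuration files that ship with a Kafka distribution. The archive name and version are illustrative:

```shell
# Unpack a Kafka release (filename/version are illustrative)
tar -xzf kafka_2.13-3.7.0.tgz
cd kafka_2.13-3.7.0

# Start ZooKeeper using the bundled sample config
bin/zookeeper-server-start.sh config/zookeeper.properties

# In a second terminal, start the Kafka broker
bin/kafka-server-start.sh config/server.properties
```

Both servers run in the foreground, which is why each is typically started in its own terminal while you are learning.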

Lesson 6, “Clusters”: Clusters are a key component of Kafka. This lesson discusses the purpose of clusters and how replication keeps data available when a broker fails. You also see a demonstration of how to configure a cluster.
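
To make replication concrete, here is a sketch of creating and inspecting a replicated topic with the `kafka-topics.sh` tool. The topic name, partition count, and broker address are illustrative, and it assumes a cluster of at least three brokers; note that older releases from the era of this course addressed the tool at ZooKeeper (`--zookeeper localhost:2181`) rather than at a broker:

```shell
# Create a topic whose partitions are each copied to three brokers
bin/kafka-topics.sh --create --topic events \
  --partitions 6 --replication-factor 3 \
  --bootstrap-server localhost:9092

# Show which broker leads each partition and which replicas are in sync
bin/kafka-topics.sh --describe --topic events \
  --bootstrap-server localhost:9092
```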

Lesson 7, “Writing Producers”: This lesson explains the concepts and code for creating producers. The lesson focuses on how producers communicate with Kafka and how to handle serialization.
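
As a taste of what this lesson covers, here is a minimal producer sketch using the Java client. The broker address, topic name (`events`), and key/value strings are all illustrative; the serializer settings show how producers turn your keys and values into the bytes Kafka actually stores:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker(s) to bootstrap from; address is illustrative
        props.put("bootstrap.servers", "localhost:9092");
        // Serializers convert keys and values into bytes for the broker
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                new ProducerRecord<>("events", "user-42", "page_view");
            // send() is asynchronous; the callback reports success or failure
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Wrote to %s-%d at offset %d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        } // close() flushes any buffered records
    }
}
```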

Lesson 8, “Writing Consumers”: This lesson discusses the concepts and code for creating consumers. The lesson starts with key concepts of consumers, including consumer groups, and then discusses how to create a consumer and subscribe to topics. The lesson also covers the concepts of polling, commits, rebalancing listeners, and deserializers.
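
The consumer side of the same sketch is shown below, again with illustrative broker address, group id, and topic name. It illustrates the concepts this lesson names: a consumer group (`group.id`), subscribing, polling, deserializers, and a manual commit. Older clients from this course's era passed a plain timeout in milliseconds to `poll()` rather than a `Duration`:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SimpleConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        // Consumers sharing a group.id divide the topic's partitions among them
        props.put("group.id", "page-view-counters");
        // Deserializers reverse what the producer's serializers did
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("events"));
            while (true) {
                // poll() fetches the next batch and keeps the group membership alive
                ConsumerRecords<String, String> records =
                    consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s => %s%n", record.key(), record.value());
                }
                consumer.commitSync(); // record progress only after processing
            }
        }
    }
}
```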

Lesson 9, “Kafka Operations”: This lesson focuses on several operational features that control how Kafka performs, and shows you how to adjust them. It starts with how to administer topics and then moves into a discussion of balancing and mirroring. Bandwidth and quotas are also covered.
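
Two of these operations can be sketched with the command-line tools that ship with Kafka. The topic name, client name, byte rate, and broker address below are illustrative, and older releases addressed these tools at ZooKeeper (`--zookeeper localhost:2181`) instead of a broker:

```shell
# Topic administration: expand an existing topic to 12 partitions
bin/kafka-topics.sh --alter --topic events --partitions 12 \
  --bootstrap-server localhost:9092

# Quotas: cap the producer named "clientA" at roughly 1 MB/s
bin/kafka-configs.sh --alter \
  --entity-type clients --entity-name clientA \
  --add-config 'producer_byte_rate=1048576' \
  --bootstrap-server localhost:9092
```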

Lesson 10, “Connectors”: Kafka Connect is a newer feature of Kafka that provides an easy way to pull in data from non-Kafka sources. The goal of this lesson is to provide a good overview of what connectors are and to discuss some basic configuration of Kafka Connect, transformations, and the Connect REST API.
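
As a small preview, the sketch below runs Connect in standalone mode with the file-source example configuration that ships in a Kafka distribution, then queries the Connect REST API. The connector name in the status call comes from that sample config file, and the REST port (8083) is the default:

```shell
# Run Connect in standalone mode with the bundled file-source example
bin/connect-standalone.sh config/connect-standalone.properties \
  config/connect-file-source.properties

# Query the Connect REST API (defaults to port 8083)
curl http://localhost:8083/connectors                          # list connectors
curl http://localhost:8083/connectors/local-file-source/status # check one
```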

Lesson 11, “Streams”: Kafka Streams is a newer feature of Kafka that provides an easy way to process and transform data as it flows through Kafka topics. This is a rather large topic, so the goal of this lesson is to provide a good overview of what streams are. In addition to covering the basic concepts of streams, we discuss how streams might be implemented in a system that also uses Kafka Connect.
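
A minimal Streams application, sketched below with the Java DSL, reads from one topic, transforms each value, and writes to another. The application id, topic names, and broker address are illustrative; note that versions of Kafka from this course's era used the older `KStreamBuilder` API rather than `StreamsBuilder`:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class UppercaseStream {
    public static void main(String[] args) {
        Properties props = new Properties();
        // application.id also names the consumer group the app uses internally
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG,
                  Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG,
                  Serdes.String().getClass());

        // Build a topology: read "events", uppercase each value, write it out
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> source = builder.stream("events");
        source.mapValues(value -> value.toUpperCase())
              .to("events-uppercased");

        new KafkaStreams(builder.build(), props).start();
    }
}
```

In a pipeline that also uses Kafka Connect, a source connector would feed the input topic and a sink connector would drain the output topic, with the Streams app doing the transformation in between.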

Table of Contents

01 Apache Kafka Fundamentals LiveLessons – Introduction
02 Learning objectives
03 1.1 Understanding Messaging
04 1.2 Origin – The LinkedIn Story
05 1.3 The Kafka Solution
06 1.4 Kafka Advantages
07 Learning objectives
08 2.1 Overview of Use Cases
09 2.2 Use Case #1
10 2.3 Use Case #2
11 2.4 Use Case #3
12 Learning objectives
13 3.1 Distributed Streaming Platform
14 3.2 Overview of APIs
15 3.3 Topics and Partitions
16 3.4 Producers and Consumers
17 3.5 Kafka as a Messaging System
18 3.6 Kafka as a Storage System
19 3.7 Kafka for Stream Processing
20 Learning objectives
21 4.1 The Producer API
22 4.2 The Consumer API
23 Learning objectives
24 5.1 Hardware Considerations
25 5.2 Operating System and Java Installation
26 5.3 Zookeeper Installation
27 5.4 Kafka Broker Installation
28 5.5 Topics Configurations
29 5.6 Lab Exercise
30 Learning objectives
31 6.1 What Are Kafka Clusters
32 6.2 Replication
33 Learning objectives
34 7.1 Producer Configuration
35 7.2 Constructing Producers
36 7.3 Communicating with Kafka
37 7.4 Serializers
38 7.5 Partitions
39 Learning objectives
40 8.1 Understanding Consumers
41 8.2 Consumer Groups
42 8.3 Consumer Configuration
43 8.4 Subscribing to Topics
44 8.5 Polls
45 8.6 Commits
46 8.7 Rebalancing Listeners
47 8.8 Deserializers
48 Learning objectives
49 9.1 Topic Administration
50 9.2 Balancing and Mirroring
51 9.3 Bandwidth and Quotas
52 Learning objectives
53 10.1 Overview
54 10.2 Configuration
55 10.3 Transformations
56 10.4 REST API
57 Learning objectives
58 11.1 Overview
59 11.2 Concepts and Architecture
60 Apache Kafka Fundamentals LiveLessons – Summary