CentralMesh.io

Kafka Fundamentals for Beginners

5.1 Kafka API Introduction

Overview of Kafka's Producer and Consumer APIs.


Overview

This chapter focuses on working directly with Kafka APIs - the developer-centric approach to building Kafka applications. Understanding these APIs allows you to build custom data pipelines and integrate Kafka into your applications.

Why Learn Kafka APIs?

Kafka APIs enable:

  • Building custom data pipelines
  • Integrating Kafka into applications
  • Flexible streaming solutions
  • Real-time data processing
  • Enterprise-grade scalability

Core APIs

Producer API

The publishing side of Kafka's publish-subscribe model. Producers send messages (records) to Kafka topics; this is how data enters Kafka from your applications.
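As a first look, here is a minimal Producer API sketch in Java. It is illustrative, not the chapter's final implementation: it assumes the `org.apache.kafka:kafka-clients` dependency on the classpath, a broker reachable at `localhost:9092`, and the `payment` topic created later in this lesson.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PaymentProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // try-with-resources closes the producer, flushing any pending sends
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // the key ("order-42", a made-up example) routes all messages
            // for one order to the same partition, preserving their order
            producer.send(new ProducerRecord<>("payment", "order-42", "{\"amount\": 19.99}"));
        }
    }
}
```

Note the serializer settings: the Producer API is typed, and Kafka needs to know how to turn keys and values into bytes on the wire.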

Consumer API

The reading side of the model: consumers retrieve messages from topics, enabling data to flow out of Kafka for processing. It is essential for real-time applications and downstream data systems.
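A matching Consumer API sketch, under the same assumptions (the `kafka-clients` dependency, a broker at `localhost:9092`, and the `payment` topic; the group id `payment-processors` is a made-up example):

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PaymentConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        // consumers sharing a group.id split the topic's partitions among themselves
        props.put("group.id", "payment-processors");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("payment"));
            while (true) {
                // poll blocks for up to one second waiting for new records
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s -> %s%n", record.key(), record.value());
                }
            }
        }
    }
}
```

The poll loop is the heart of every consumer: it fetches batches of records and, by default, periodically commits offsets so the group can resume where it left off.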

Environment Setup

Install Java

Windows: Download the JDK from Oracle and set the JAVA_HOME environment variable.

Mac: Install via Homebrew and set JAVA_HOME in your shell profile:

bash
brew install openjdk
export JAVA_HOME=$(/usr/libexec/java_home)

Linux: Install via your package manager:

bash
sudo apt update
sudo apt install openjdk-11-jdk
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64

Install Gradle

  1. Download from Gradle website
  2. Extract to directory
  3. Set GRADLE_HOME environment variable
  4. Add Gradle's bin directory to PATH

Mac/Linux:

bash
export GRADLE_HOME=/opt/gradle/gradle-7.x
export PATH=$PATH:$GRADLE_HOME/bin

Windows:

  • Create GRADLE_HOME system variable
  • Add %GRADLE_HOME%\bin to PATH

Set Up Kafka with Docker

Create docker-compose.yaml defining:

  • Zookeeper
  • Multiple Kafka brokers (for a realistic multi-broker setup)
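One possible shape for that file is sketched below with three brokers, which matches the replication factor of 3 used later in this lesson. The image versions, ports, and listener settings are illustrative assumptions, not the course's exact file; brokers 2 and 3 repeat the pattern of broker 1 with their own ids and ports.

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  # each broker needs a unique id and a unique host port;
  # INTERNAL is used between brokers, EXTERNAL by clients on the host
  kafka1:
    image: confluentinc/cp-kafka:7.4.0
    depends_on: [zookeeper]
    ports: ["9092:9092"]
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENERS: INTERNAL://0.0.0.0:19092,EXTERNAL://0.0.0.0:9092
      KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka1:19092,EXTERNAL://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL

  kafka2:
    image: confluentinc/cp-kafka:7.4.0
    depends_on: [zookeeper]
    ports: ["9093:9093"]
    environment:
      KAFKA_BROKER_ID: 2
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENERS: INTERNAL://0.0.0.0:19092,EXTERNAL://0.0.0.0:9093
      KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka2:19092,EXTERNAL://localhost:9093
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL

  kafka3:
    image: confluentinc/cp-kafka:7.4.0
    depends_on: [zookeeper]
    ports: ["9094:9094"]
    environment:
      KAFKA_BROKER_ID: 3
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENERS: INTERNAL://0.0.0.0:19092,EXTERNAL://0.0.0.0:9094
      KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka3:19092,EXTERNAL://localhost:9094
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL
```

The dual-listener setup matters on a single machine: brokers talk to each other over the Docker network, while your applications connect from the host via the mapped localhost ports.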

Start the cluster:

bash
docker-compose up -d

Create a Topic

Create a 'payment' topic:

bash
kafka-topics.sh --create --topic payment \
  --bootstrap-server localhost:9092 \
  --partitions 2 \
  --replication-factor 3

This ensures:

  • High availability through replication (a replication factor of 3 requires at least three brokers)
  • Better parallelism through partitions
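To confirm the layout, you can describe the topic once the cluster is running (assumes a broker reachable at localhost:9092):

```shell
kafka-topics.sh --describe --topic payment --bootstrap-server localhost:9092
```

The output lists each partition with its leader broker, replicas, and in-sync replicas, so you can see the replication at work.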

Summary

This lesson established the foundation for working with Kafka APIs:

  • Installed Java and Gradle
  • Configured Kafka with Docker
  • Created test topics
  • Ready to build Kafka applications

Next lessons will dive into Producer and Consumer API implementation.