Publish / Subscribe to Kafka locally

Tags: kafka kcat docker

Written by: Tushar Sharma

Apache Kafka is a distributed streaming platform designed for building real-time data pipelines and streaming applications. Originally developed by LinkedIn, Kafka excels at handling high-throughput, fault-tolerant publish-subscribe messaging between systems.

Core Concepts

Before diving into the setup, let's understand Kafka's fundamental concepts:

  - Topic - a named, append-only stream of records; producers write to topics and consumers read from them
  - Partition - each topic is split into partitions, which provide parallelism and preserve ordering within a single partition
  - Producer - a client that publishes records to a topic
  - Consumer - a client that subscribes to one or more topics, usually as part of a consumer group
  - Broker - a Kafka server that stores partitions; a cluster consists of one or more brokers
  - Zookeeper - coordinates the brokers (metadata, controller election) in classic, pre-KRaft deployments like the one below

Local Development Setup

For local development, we'll use Docker Compose to orchestrate Kafka and its dependencies. This approach provides isolated, reproducible environments without cluttering your system.

Docker Compose Configuration

Our setup includes three services:

  1. Zookeeper - Manages cluster coordination and configuration
  2. Kafka - The main message broker
  3. Kafka-init - Initializes topics and waits for Kafka readiness

Configuration Breakdown

Zookeeper Service
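
The original compose file isn't reproduced here, so the snippet below is a minimal sketch of the Zookeeper service, assuming the commonly used confluentinc/cp-zookeeper image; it sits under the services: key of docker-compose.yml:

  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      # Port on which Zookeeper listens for client (broker) connections
      ZOOKEEPER_CLIENT_PORT: 2181
      # Basic time unit in milliseconds used for heartbeats and timeouts
      ZOOKEEPER_TICK_TIME: 2000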

Kafka Service
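
Again a sketch rather than the exact original file, assuming the confluentinc/cp-kafka image and the common two-listener pattern (one listener for containers on the Compose network, one published to the host):

  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on:
      - zookeeper
    ports:
      # Expose the host-facing listener so local tools can reach localhost:9092
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      # Where the broker finds Zookeeper (service name + client port)
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Both listeners use plaintext, which is fine for a local single-broker setup
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      # Addresses handed back to clients: kafka:29092 inside the Compose network,
      # localhost:9092 from the host
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      # Single broker, so the internal offsets topic cannot be replicated
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1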

Key environment variables explained:

  - KAFKA_ZOOKEEPER_CONNECT - tells the broker where to find Zookeeper (service name and client port)
  - KAFKA_LISTENER_SECURITY_PROTOCOL_MAP - maps each listener name to a security protocol; plaintext is acceptable for a local setup
  - KAFKA_ADVERTISED_LISTENERS - the addresses the broker advertises to clients: the service name for other containers, localhost:9092 for the host
  - KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR - set to 1 because there is only one broker

⚠️ Important: KAFKA_ADVERTISED_LISTENERS must match how clients will actually reach Kafka. Advertise localhost:9092 for tools running on your host during local development; clients inside the Compose network should use the internal listener (kafka:29092 in the sketch above) instead.

Kafka-init Service

This utility container:

  1. Waits until the broker accepts connections
  2. Creates test-topic if it doesn't already exist
  3. Exits once the topic is in place
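
A sketch of such a service, reusing the cp-kafka image for its bundled kafka-topics CLI (the image, wait loop, and topic settings are assumptions, not necessarily the original configuration):

  kafka-init:
    image: confluentinc/cp-kafka:7.4.0
    depends_on:
      - kafka
    # Override the image's default entrypoint so it runs a one-shot script
    # instead of starting another broker
    entrypoint: /bin/sh
    command:
      - -c
      - |
        # Retry until the broker answers metadata requests
        until kafka-topics --bootstrap-server kafka:29092 --list >/dev/null 2>&1; do
          echo 'waiting for kafka...'
          sleep 2
        done
        # Create the topic used throughout this post (idempotent thanks to --if-not-exists)
        kafka-topics --bootstrap-server kafka:29092 --create --if-not-exists --topic test-topic --partitions 1 --replication-factor 1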

Starting the Environment

docker compose up
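
If you'd rather get your terminal back, you can run the stack detached and follow the broker's logs with standard Docker Compose commands (the service name kafka matches the sketch above):

docker compose up -d
docker compose logs -f kafka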

Wait for all services to be ready. You should see logs indicating Kafka is accepting connections and the test-topic has been created.
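
With the stack up, you can exercise the publish / subscribe flow from the host using kcat. This is a sketch assuming kcat is installed locally (via your package manager) and that the broker is advertised at localhost:9092 as configured above:

# Confirm the broker is reachable and list its topics
kcat -b localhost:9092 -L

# Publish: each line on stdin becomes one message on test-topic
echo "hello kafka" | kcat -b localhost:9092 -t test-topic -P

# Subscribe: read test-topic from the beginning and print every message
kcat -b localhost:9092 -t test-topic -C -o beginning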

