
Real-Time Messaging Made Easy with Kafka, Docker & Python

  • Writer: Mohammed Juyel Haque
  • Apr 6
  • 2 min read

Apache Kafka is a distributed streaming platform used for building real-time data pipelines and streaming applications. In this blog, we’ll walk through a simple Kafka setup using Docker, and test it using Python. This is perfect for beginners who want to get up and running quickly with minimal setup.



Step 1: Setup Kafka Using Docker

We'll use Docker Compose to spin up Zookeeper and a Kafka Broker using Confluent’s official Docker images.


Create the docker-compose.yml

Create a new file named docker-compose.yml and paste the following content:

version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    ports:
      - 2181:2181

  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - 9092:9092
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

Start Kafka

Run the following command to start the Kafka and Zookeeper services:

docker-compose up -d

Verify

Check if the containers are running:

docker ps

You should see both zookeeper and kafka running.


Step 2: Create a Kafka Topic

Exec into Kafka Container

docker exec -it <container_id_or_name> bash

Replace <container_id_or_name> with your Kafka container ID or name.


Create a Topic

Inside the container, run:

kafka-topics --create --topic juyel-topic --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

List Topics

kafka-topics --list --bootstrap-server localhost:9092

You should see juyel-topic listed.
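
If you'd rather not exec into the container, the kafka-python library we install in Step 3 also ships an admin client. Here's a minimal sketch (assuming the broker is reachable on localhost:9092) that creates and lists the topic straight from Python:

from kafka.admin import KafkaAdminClient, NewTopic

# Connect to the broker exposed by Docker Compose
admin = KafkaAdminClient(bootstrap_servers='localhost:9092')

# Create the topic (raises TopicAlreadyExistsError if it already exists)
admin.create_topics([NewTopic(name='juyel-topic', num_partitions=1, replication_factor=1)])

# Print every topic the broker knows about
print(admin.list_topics())
admin.close()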


Step 3: Python Code to Test Kafka

We'll use the kafka-python library to send and receive messages from Kafka.

Install kafka-python

pip install kafka-python
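
Before writing the producer and consumer, it can help to confirm the client can actually reach the broker. A minimal sketch, assuming kafka-python 2.0+ (which adds bootstrap_connected()):

from kafka import KafkaProducer

# Prints True once the client has completed its initial connection to localhost:9092
producer = KafkaProducer(bootstrap_servers='localhost:9092')
print(producer.bootstrap_connected())
producer.close()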

Producer Code: send_message.py

from kafka import KafkaProducer

# Connect to the broker exposed by Docker on localhost:9092
producer = KafkaProducer(bootstrap_servers='localhost:9092')

topic = 'juyel-topic'
for i in range(10):
    message = f'Hello Kafka {i}'
    # Kafka messages are raw bytes, so encode the string before sending
    producer.send(topic, value=message.encode('utf-8'))
    print(f"Sent: {message}")

# Block until all buffered messages have actually been delivered
producer.flush()
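
The producer above sends plain UTF-8 strings. If you want to send structured data instead, kafka-python accepts a value_serializer that converts each value to bytes for you. A minimal sketch, assuming JSON payloads:

import json
from kafka import KafkaProducer

# Serialize each dict to JSON bytes before it is sent
producer = KafkaProducer(
    bootstrap_servers='localhost:9092',
    value_serializer=lambda v: json.dumps(v).encode('utf-8')
)

producer.send('juyel-topic', value={'id': 1, 'text': 'Hello Kafka'})
producer.flush()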

Consumer Code: read_message.py

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'juyel-topic',
    bootstrap_servers='localhost:9092',
    auto_offset_reset='earliest',   # start from the oldest message if no offset is stored
    enable_auto_commit=True,        # commit offsets automatically
    group_id='my-group'
)

print("Reading messages from Kafka:")
# Blocks and prints each message as it arrives
for message in consumer:
    print(f"Received: {message.value.decode('utf-8')}")

Step 4: Test It All

Start the Consumer

python read_message.py

In Another Terminal, Start the Producer

python send_message.py

You’ll see messages being sent by the producer and instantly received by the consumer. Real-time messaging in action!


Cleanup

To stop all services:

docker-compose down

Final Thoughts

That’s it! You’ve successfully:

  • Deployed Kafka using Docker

  • Created a topic

  • Sent and received messages using Python

This setup is great for developing and testing Kafka-based pipelines. Next up, I'll cover more advanced features like Kafka Streams, Kafka Connect, and integrating Kafka with databases.


 
 
 
