Python Examples

Kafka-python is a Python client for the Apache Kafka distributed stream processing system. It is designed to function much like the official Java client, with a sprinkling of Pythonic interfaces (e.g., consumer iterators).


  • You will need an Eventador deployment and topic. You can follow the steps in the getting started guide to create one.
  • Install kafka-python via pip: pip install kafka-python

Step 1: Get your endpoint

  • From the Eventador console, under deployments, click your Apache Kafka cluster.
  • Switch to the Connections tab.
  • Copy the value in Plain Text Endpoint.
  • You will use this value in Step 2.

    Note: make sure you have added the IP address of your machine to the cluster’s ACL, or you will not be able to access your cluster (see “Add an ACL” in the getting started guide).

Step 2: Create a producer

A producer is a Kafka client that publishes records to the Kafka cluster. KafkaProducer supports many configuration options, which you can read about in the kafka-python documentation. For this example we will stick with a basic producer.

  • Create a file
  • Let’s start by adding global KAFKA_TOPIC and KAFKA_BROKER values. These values will be used to tell our producer how to connect to our cluster.
  • KAFKA_TOPIC should be the topic you created in the getting started guide. In this case we will use the demo topic.
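The globals described above might look like the following. The broker endpoint here is a placeholder; substitute the Plain Text Endpoint value you copied in Step 1.

```python
# Connection settings for the examples below.
# The endpoint is a placeholder -- use your own Plain Text Endpoint.
KAFKA_TOPIC = "demo"
KAFKA_BROKER = "your-endpoint.eventador.io:9092"
```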

Tip: If you have a single broker (developer plan) this will be a single endpoint ending in :9092. If you have a plan with multiple brokers, you can add additional endpoints separated by commas. Ex: 'endpoint0:9092,endpoint1:9092,endpoint2:9092'

Now we add the producer. You will need to import KafkaProducer.

Step 3: Send some messages

Run this script with python.

This will publish these three messages to your topic.

Tip: Kafka clients work with bytes. If your messages are not already bytes you can use value_serializer to tell the client how to encode your message.

Example producer that handles JSON messages

Step 4: Create a consumer

Now we want to create a consumer so we can read back our messages. Consumers can be added at any time. In most production applications you will have N consumers subscribed to a topic. These messages will be consumed in real time as they are produced. A key feature of Kafka is the ability to read a topic starting at the earliest stored message. Any new consumer can read all historic messages when they connect. We will use that feature now.

  • Create a new file
  • We will start with connection values just as we did in the producer setup.

This time we’ll import KafkaConsumer.

auto_offset_reset='earliest' tells the consumer to start reading from the beginning of the topic. If left unset it defaults to 'latest', which reads only new messages.

Step 5: Read the messages

Run the script with python.

If there are other messages in your topic, those will be printed out as well. Once the consumer is caught up, it will sit waiting for new messages to be sent to the subscribed topic. If you want to see your consumer in action, you can open a new terminal and re-run the producer script. New messages will show up in real time.

You can view the code used for this tutorial in the tutorials folder of our examples repository.

Questions, feedback, or finding any of these steps unclear? Please contact us. We love helping people get started.
