Kafkacat Example | Eventador.io

You can use any of the available clients to interface with your cluster. In this example we will use kafkacat, a command-line producer and consumer.


Step 1: Get your endpoint

Note: make sure you have added the IP address of your machine to an ACL, or you will not be able to access your cluster (see "Add an ACL" in the getting started guide).
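Once you have your endpoint, export it as $BROKERS so the commands in the following steps can reference it. The broker string below is a placeholder; substitute the value from your cluster's plain text endpoint box.

```shell
# Export the broker list so later kafkacat commands can use it.
# The hosts/ports below are placeholders -- use the value from
# your cluster's plain text endpoint box instead.
export BROKERS="broker0.example.com:9092,broker1.example.com:9092"
```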

Step 2: Send some data

Now we're ready to send (produce) some data. In producer mode, kafkacat reads messages from stdin and produces them to the provided Kafka cluster (-b) and topic (-t). For more information, check the kafkacat man page.

First, let’s double check $BROKERS is set:

echo $BROKERS  
# should match value from plain text endpoint box

Enter the following commands:

echo '{"sensor": 1, "temp": 100}' | kafkacat -P -b $BROKERS -t demo
echo '{"sensor": 2, "temp": 101}' | kafkacat -P -b $BROKERS -t demo

Step 3: Consume some data

This time we will use kafkacat to read (consume) from the topic. In consumer mode, kafkacat reads messages from a topic and prints them to stdout.

Make sure $BROKERS is still set before starting.

Enter the following command:

kafkacat -C -b $BROKERS -t demo -o beginning -e -q

You should see the two messages printed to your terminal:

{"sensor": 2, "temp": 101}
{"sensor": 1, "temp": 100}

Step 4: More testing

To test our work, let's add a few more messages. Enter the following:

echo '{"sensor": 3, "temp": 95}' | kafkacat -P -b $BROKERS -t demo
echo '{"sensor": 4, "temp": 103}' | kafkacat -P -b $BROKERS -t demo

Now let's read the topic from the beginning again. Enter the following command:

kafkacat -C -b $BROKERS -t demo -o beginning -e -q

You should see:

{"sensor": 3, "temp": 95}
{"sensor": 2, "temp": 101}
{"sensor": 1, "temp": 100}
{"sensor": 4, "temp": 103}

Note: Your messages may print in a different order. By default, producers rotate between available partitions. Records are guaranteed to be in order within a partition, but when a consumer is subscribed to a topic with multiple partitions, messages may be consumed in a different order than they were sent. You can read more about this in the Kafka documentation.
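To see which partition each record landed on, you can have kafkacat print the partition and offset alongside the payload using its -f format string (%p is the partition, %o the offset, and %s the message payload). This assumes $BROKERS is still set and the demo topic exists:

```shell
# Consume from the beginning, printing each record's partition and
# offset so the per-partition ordering described above is visible.
kafkacat -C -b $BROKERS -t demo -o beginning -e -q \
  -f 'partition %p, offset %o: %s\n'
```

Records sharing a partition number will always appear in the order they were produced; the interleaving across partitions is what varies between runs.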

Next Step

Before moving on to build a fully functional client, we recommend following our guide to setting up your environment, which automates this configuration.

When you're ready to move away from the command line, you can start streaming to/from your applications using your Apache Kafka client of choice. Check out the examples repo to see some of these clients in action.

Questions, feedback, or anything unclear in these steps? Please contact us. We love helping people get started.