Getting Started with Kafka
Eventador offers several plans, ranging from development/testing plans to full production-grade enterprise configurations. This guide shows you how to create a Kafka developer deployment, which is free for 30 days. If you'd like to deploy the Kafka + Flink plan, use this guide instead.
Eventador is a high-performance real-time data pipeline platform based on Apache Kafka. Eventador makes it easy to perform analysis and build applications using real-time streaming data. Some areas that can benefit from real-time data are sensor networks, IoT, click-stream analysis, and fraud detection, or anything that requires real-time data. Eventador is deployed on Amazon AWS and delivered as a service.
A developer deployment consists of:
- A Kafka cluster including ZooKeeper nodes (the backbone of the service)
- The ability to create up to 5 topics
- Access to the Eventador console to monitor and configure your deployment

For more details, visit our plans page.
Each account has an isolated VPC that your deployments live in. You must grant access to each deployment via the console in order for any traffic to be allowed through. More on this below.
- Create an account or login via GitHub.
- Enter credit card info on the accounts page (required for production plans, and for developer plans after the trial period).
Step 1: Create a Deployment
- After creating an account, click the Deployments tab in the Eventador Console.
- Select the Kafka Developer Plan. Note: if you want to create a Kafka + Flink deployment, make sure you follow this guide instead.
- Name the deployment.
- You will see a status bar as your deployment is provisioned.
Step 2: Add an ACL
- Once provisioning is complete you will be reminded to add an ACL. An ACL is how you set access permissions for your cluster. By default, your cluster will be closed to the outside world (including you).
- Click the link to add an ACL. You can add to or modify this list later by clicking the security button on the right side of your newly created cluster. This is an important step, so pay close attention to the on-screen help.
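Conceptually, an ACL like this is an IP allowlist: traffic is admitted only if the client's address falls inside one of the CIDR ranges you have granted. As a rough illustration of the idea (not Eventador's actual implementation; the CIDR range and client addresses below are made-up examples), Python's standard-library ipaddress module can model the check:

```python
# Illustration only: a simple IP-allowlist check, modeling how an ACL
# admits or rejects traffic. The CIDR block and client IPs are examples.
import ipaddress

def is_allowed(client_ip: str, acl: list) -> bool:
    """Return True if client_ip falls inside any CIDR range in the ACL."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in acl)

acl = ["203.0.113.0/24"]  # an example range you might grant access to

print(is_allowed("203.0.113.7", acl))   # inside the range -> True
print(is_allowed("198.51.100.9", acl))  # outside the range -> False
```

This is why, until you add an ACL entry covering your own IP, even your own connections are rejected.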
Step 3: Create a Topic (optional)
A topic is a container to hold a stream of related data.
- From the Deployments tab inside your new cluster, you should see a box containing the words Apache Kafka with a green check next to it. Click this box.
- You should see an existing topic, defaultsink. This is the default topic created during provisioning.
- To create a new topic, click the button to add a topic.
- Name the topic demo and set the number of partitions to 16. The replication factor is set to 1 for development clusters but can be increased in production deployments.
- Click create
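Partitions matter because they determine how a topic's data is spread out and consumed in parallel. Kafka's default partitioner assigns a keyed message to a partition by hashing its key modulo the partition count, so with 16 partitions every key lands deterministically on one of partitions 0–15. Kafka itself uses a murmur2 hash; the sketch below substitutes a standard-library hash purely to show the idea:

```python
# Sketch of keyed partition assignment: hash(key) % num_partitions.
# Kafka's default partitioner uses murmur2; md5 here is a stand-in
# just to keep the example deterministic and stdlib-only.
import hashlib

def assign_partition(key: bytes, num_partitions: int = 16) -> int:
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# The same key always maps to the same partition, so messages for one
# key stay ordered; different keys spread across the 16 partitions.
p1 = assign_partition(b"sensor-42")
p2 = assign_partition(b"sensor-42")
assert p1 == p2 and 0 <= p1 < 16
```

This is why choosing a good message key matters: all messages sharing a key are ordered within one partition, while distinct keys let the cluster parallelize across partitions.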
Congratulations! You have set up a new Apache Kafka cluster. Now that your topic is created, you're ready to start sending data.
Next: Send data with kafkacat
Questions, feedback, or steps you found unclear? Please contact us. We love helping people get started.