Build a Flink project

Flink Projects


Projects let developers integrate existing SDLC workflows into a Flink project via GitHub. You check in your code and Eventador handles all the complexity of the build and deploy process. If you have a specific integration process or branching strategy, those workflows apply directly to your Flink code and your Flink deployments on Eventador.

Prerequisites

  • An account linked with GitHub.
  • If you already have an account, you can link it with GitHub by visiting your profile and clicking the “Link this Account to GitHub” button.
  • An Eventador Flink deployment. To create a deployment, follow this guide.
  • At least two topics. This tutorial will use the input and output topics created in the getting started guide, but the framework is flexible enough that you can use any topics you like.

Create a Flink project

To create a project:

  • From the projects tab, click “create project”.
  • Select the “Java – Read From Kafka, Write to Kafka” Flink template. These templates serve to teach as well as reduce time to market. In future projects you can start with a blank template if you wish. Templates are simply GitHub repositories – Eventador imports them when you create a project. A template includes all the components to build an entire Flink job – typically in Java. It’s a complete project with all the scaffolding and files required. You focus on the core logic; we’ll cover the rest.
  • Fill in the rest of the form and click create.
  • This template is boilerplate plus a simple Flink job that takes in a message from one topic (input) and posts it to another topic (output). That is not very useful by itself, but it is a good starting point for more complex Flink jobs that map and filter incoming messages. A rough sketch of what such a job looks like is shown just below this list.
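The real code lives in the template’s GitHub repository, but a job of this shape boils down to a Kafka source, an identity pass-through, and a Kafka sink. The sketch below is illustrative only: the --read-topic/--write-topic/--bootstrap.servers flag names, the consumer group id, and the Flink Kafka connector classes are assumptions on our part, and the actual template may parse its arguments differently.

    // Hypothetical sketch of a read-from-Kafka, write-to-Kafka Flink job:
    // consume strings from one topic and forward them, unchanged, to another.
    import java.util.Properties;

    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.api.java.utils.ParameterTool;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;

    public class FlinkReadWriteKafka {
        public static void main(String[] args) throws Exception {
            // e.g. --read-topic input --write-topic output --bootstrap.servers host:9092
            ParameterTool params = ParameterTool.fromArgs(args);

            Properties props = new Properties();
            props.setProperty("bootstrap.servers", params.getRequired("bootstrap.servers"));
            props.setProperty("group.id", "flink-read-write-kafka");

            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Source: read raw strings from the read topic.
            DataStream<String> messages = env.addSource(
                new FlinkKafkaConsumer<>(params.getRequired("read-topic"),
                                         new SimpleStringSchema(), props));

            // Sink: forward every message, as-is, to the write topic.
            messages.addSink(
                new FlinkKafkaProducer<>(params.getRequired("bootstrap.servers"),
                                         params.getRequired("write-topic"),
                                         new SimpleStringSchema()));

            env.execute("Read from Kafka, write to Kafka");
        }
    }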

Build and Deploy a Flink project

Now we will build and deploy your project. Select your new project from the projects page.

  • The template code found here takes several command line arguments; you can enter these arguments straight from the web UI. Replace READ_TOPIC and WRITE_TOPIC with input and output (the topics we created earlier).
  • $EVENTADOR_KAFKA_BROKERS is a dynamic argument that automatically detects your broker endpoint. You could also use the actual plain text endpoint found in your Kafka deployment connection tab, e.g. xxxxxxxx-kafka0.pub.va.eventador.io:9092.

If you are deploying the FlinkReadWriteKafka template, your form should look something like the example below.
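For illustration only (the exact flag names come from the template’s argument parsing, and this line follows the assumptions in the sketch above), the arguments field might contain:

    --read-topic input --write-topic output --bootstrap.servers $EVENTADOR_KAFKA_BROKERS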

Monitor your Flink Job

Switch over to the deployments tab and click the “Apache Flink” box. You should see your Flink job running. The Eventador Console has an integrated view of the Flink cluster, showing jobs, the execution plan, state, and more. This gives you full control over running jobs and the overall topology of the cluster. The Flink UI is built into the Eventador Console, so it has the same team-level access controls as the rest of the platform.

See Flink in action

Since we know any message posted to input should be read and forwarded to output, we can use kafkacat to post a message then read from output to test our setup. If you have not used kafkacat before, you may want to start with this guide.

First, set your BROKERS environment variable to the value from the connections tab, e.g. BROKERS=xxxxxxxx-kafka0.pub.va.eventador.io:9092. This is covered further in the kafkacat guide.
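A minimal example, assuming a bash-style shell and substituting your own plain text endpoint from the connections tab:

    export BROKERS=xxxxxxxx-kafka0.pub.va.eventador.io:9092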

Now we’ll validate that the output topic is empty (or see what’s already there).
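Assuming kafkacat is installed and BROKERS is set as above, a consumer that prints whatever is currently on output and then exits looks like this:

    kafkacat -C -b $BROKERS -t output -e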

OK, let’s post a message to the input topic.
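For example, piping an arbitrary test message into kafkacat in producer mode:

    echo '{"sensor": "test", "value": 42}' | kafkacat -P -b $BROKERS -t input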

And now we read output again to validate that the message was forwarded on by Flink.
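Reading the output topic again should now show the message we just produced, forwarded along by the Flink job:

    kafkacat -C -b $BROKERS -t output -e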

Indeed! It looks like Flink read our message from input and posted it to output. Of course this is a simple example, but it lays the foundation for a lot of awesome pipelines. Instead of forwarding every message, you could hook into a log stream and send alerts only when you notice something awry, or monitor thousands of warehouse sensors for temperatures rising above a threshold. The possibilities are endless.

Next Steps

To learn more about Flink, check the Flink docs. Here is a demo of Flink in action.

Please contact us if you experience any issues following this guide or have questions about Flink. We love helping people get started.
