Kafka SASL Authentication

February 20, 2018 in Product News

Support for the Salted Challenge Response Authentication Mechanism (SCRAM) under the Simple Authentication and Security Layer (SASL) framework was added to Apache Kafka in the 0.10.2 release. This allows for simple username/password authentication to Kafka. We are excited to add this authentication mechanism to the Eventador service. Here is how it works.


Kafka SASL/SCRAM support started life as KIP-84 and grew into KAFKA-3751, ultimately making it into the 0.10.2 release. We recently released Kafka 1.0.0 onto our platform and followed up by adding support for SASL/SCRAM. While KAFKA-3751 made this authentication mechanism possible, configuring it by hand is a hassle and confusing at best. In bringing this feature to our console, we wanted to make authenticating to Kafka with a simple username and password easy, while keeping all of the power and security.

SASL on Eventador

SASL is a key component of the security configuration of your Kafka deployment. At Eventador, we previously enabled you to whitelist consumers and producers via our deployment-scoped ACL controls and to encrypt communications via SSL.

Now we have added the last piece of the picture: the capability to use SASL/SCRAM for authentication to Eventador Kafka deployments. Eventador handles creating the CA certificate and all user account management (CRUD operations). This saves you the hassle of configuring and managing users via the default Kafka scripts.
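For comparison, here is a sketch of what managing SCRAM credentials on a stock Kafka 1.0 cluster looks like: every create, update, and delete is a `kafka-configs.sh` run against ZooKeeper. The username `alice`, the password, and the ZooKeeper address below are placeholders, not values from our platform.

```shell
# Create (or update) SCRAM-SHA-256 credentials for user "alice" on a stock cluster
bin/kafka-configs.sh --zookeeper localhost:2181 --alter \
  --add-config 'SCRAM-SHA-256=[iterations=4096,password=alice-secret]' \
  --entity-type users --entity-name alice

# Delete the same credentials when the user is removed
bin/kafka-configs.sh --zookeeper localhost:2181 --alter \
  --delete-config 'SCRAM-SHA-256' \
  --entity-type users --entity-name alice
```

With Eventador, the console performs the equivalent operations for you.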

Creating and managing users

Creating users is simple: log into the Eventador Console, select the Kafka deployment you want to add users to, then select the ‘SASL Users’ tab. Add users by clicking the ‘Add User’ button. You can add and remove users as needed, and change or reset passwords.

An end-to-end SASL example

It should be noted that at this point many drivers support SASL with varying levels of maturity, so you may want to check compatibility before you dive in too deep – YMMV. In our example, we are going to use Scala.

First, let’s set up an environment and create our keyfile. We will be pasting the key from the Eventador Console into the keyfile. To grab your key, navigate to your deployment here, select configure for the deployment you want to produce to, then the tab labeled ‘SASL Users’. Substitute <paste cert here> with the text from the box labeled ‘Deployment CA Certificate’.

# setup environment
docker run -it --rm ubuntu:16.04
apt update && apt install -y scala wget curl vim jq kafkacat
cd /root
wget http://central.maven.org/maven2/org/apache/kafka/kafka-clients/1.0.0/kafka-clients-1.0.0.jar
wget http://central.maven.org/maven2/org/slf4j/slf4j-api/1.7.25/slf4j-api-1.7.25.jar
# create cert
mkdir -p /opt/eventador
cat > /opt/eventador/CA.pem << EOF
<paste cert here>
EOF
# create key
keytool -keystore /opt/eventador/ev.truststore.jks -alias CARoot -import -file /opt/eventador/CA.pem -storepass eventadorCApassword -noprompt
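To confirm the certificate actually landed in the truststore, you can list its contents; the keystore path, alias, and store password match the `keytool -import` command above.

```shell
# List the truststore contents; the CARoot entry should appear as a trustedCertEntry
keytool -list -keystore /opt/eventador/ev.truststore.jks -storepass eventadorCApassword
```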

With the environment ready, start the Scala REPL in the same Docker container, putting the downloaded jars on the classpath:

scala -classpath "*.jar"

Let's produce some events. You need to substitute `<username>` with your username and `<password>` with your password. You will also need to substitute `<sasl_ssl Endpoint>` with your endpoint from the Eventador Console.

import java.util.Properties
import scala.collection.JavaConverters._
import org.apache.kafka.clients.producer.{ProducerConfig, ProducerRecord, KafkaProducer}
import org.apache.kafka.common.serialization.StringSerializer

class Producer(brokers: String, topic: String) {

  val producer = new KafkaProducer[String, String](configuration)

  private def configuration: Properties = {
    val sasl_username = "<username>"
    val sasl_password = "<password>"

    val jaasTemplate = "org.apache.kafka.common.security.scram.ScramLoginModule required username=\"%s\" password=\"%s\";"
    val jaasConfig = String.format(jaasTemplate, sasl_username, sasl_password)

    val props = new Properties()
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, brokers)
    // Serializers for producers, Deserializers for consumers
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, classOf[StringSerializer].getCanonicalName)
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, classOf[StringSerializer].getCanonicalName)
    props.put("ssl.truststore.type", "jks")
    props.put("ssl.truststore.location", "/opt/eventador/ev.truststore.jks")
    props.put("ssl.truststore.password", "eventadorCApassword")
    props.put("security.protocol", "SASL_SSL")
    props.put("sasl.mechanism", "SCRAM-SHA-256")
    props.put("sasl.jaas.config", jaasConfig)
    props
  }

  def produceMessages(): Unit = {
    while (true) {
      // Build a fresh message so we can see a new message with changing data
      val epoch = System.currentTimeMillis()
      val msgFmt = "{\"date\": \"%s\", \"action\": \"Scala Producer Test\", \"epoch.ms\": %s}"
      val recordMsg = String.format(msgFmt, new java.util.Date, s"$epoch")
      // Create a Kafka record, using just topic and message value means round-robin partitioning
      // Constructor summary available: https://kafka.apache.org/10/javadoc/org/apache/kafka/clients/producer/ProducerRecord.html
      val record = new ProducerRecord[String, String](topic, recordMsg)
      // Send the record asynchronously
      producer.send(record)
      // Sleep .1 second between message sends
      Thread.sleep(100)
    }
  }
}
println("Kafka Producer started.")
val producer = new Producer(brokers = "<sasl_ssl Endpoint>", topic = "defaultsink")
println("Kafka Producer created - start publishing messages via produceMessages()")

Now you should be producing messages. You can monitor the activity in your dashboard or use kafkacat to pop off a few messages. Here are the docs for client configs. As always, if you have any questions or are having problems with this example, hit support.
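As a sketch of the kafkacat route, the invocation below consumes the last few messages from the `defaultsink` topic over SASL_SSL. The `<sasl_ssl Endpoint>`, `<username>`, and `<password>` placeholders are the same values used in the producer; `security.protocol`, `sasl.mechanisms`, and the other `-X` settings are standard librdkafka properties, and you will need a kafkacat build recent enough to support SCRAM.

```shell
# Consume the last 5 messages from defaultsink and exit at end of partition
kafkacat -C -b <sasl_ssl Endpoint> -t defaultsink -o -5 -e \
  -X security.protocol=SASL_SSL \
  -X sasl.mechanisms=SCRAM-SHA-256 \
  -X sasl.username=<username> \
  -X sasl.password=<password> \
  -X ssl.ca.location=/opt/eventador/CA.pem
```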
