How to Publish Subscribe to Kafka with Spring and SASL/SCRAM

After securing the Kafka broker and ZooKeeper with SASL/SCRAM, it is time for the client (Java + Spring) to connect to the secured Kafka cluster.



As promised in the previous post, in this article I will show, from the client perspective, how to connect to ZooKeeper and the Kafka broker with the SASL/SCRAM protocol.

We have configured authentication between ZooKeeper and the broker, as well as between brokers. However, this is still not enough from a security perspective: we are missing authorization.

To comply with security requirements, let me also demonstrate simple role-based authorization for a producer and a consumer.

For this tutorial I will use, as usual, Spring Boot combined with Spring Kafka.

Use Case

First, we are going to create a topic. The topic name will be my-topic. Then we will create a producer to put messages into the topic, as well as a consumer to read messages from the topic. Therefore, we have to define the actors for this use case. We will need three personas:

  1. Administrator.
    The administrator is a super user. His main responsibilities are user management, role-based access management, and topic management. The administrator is represented by the user admin.
    Note: the admin user was already created in the previous post.
  2. Producer
    The producer is the one who publishes messages to the topic. Alice will be the producer in this case.
  3. Consumer
    The consumer is the one who subscribes to messages from the topic. Bob will be the consumer in this case, and he belongs to a consumer group called my-consumer-group.

Broker Configuration

First, we need to update several broker properties inside /etc/kafka/

Since we want to minimize the impact on existing topics, we need to set allow.everyone.if.no.acl.found to true. This setting allows existing topics with no ACL configuration to continue running as usual. After we have gradually set ACLs on all topics, we can safely set the value back to false.
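As a sketch of what the broker properties file might contain (the authorizer class name is an assumption that depends on your Kafka version: AclAuthorizer for Kafka 2.4+, kafka.security.auth.SimpleAclAuthorizer for older releases):

```
authorizer.class.name=kafka.security.authorizer.AclAuthorizer
allow.everyone.if.no.acl.found=true
super.users=User:admin
```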

Authorization and ACLs

We need two additional users, alice and bob, as a producer and a consumer respectively.
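Assuming SCRAM-SHA-256 and placeholder passwords (adjust both to your setup), the users can be registered with the kafka-configs tool, for example:

```
kafka-configs.sh --zookeeper localhost:2181 --alter \
  --add-config 'SCRAM-SHA-256=[password=alice-secret]' \
  --entity-type users --entity-name alice

kafka-configs.sh --zookeeper localhost:2181 --alter \
  --add-config 'SCRAM-SHA-256=[password=bob-secret]' \
  --entity-type users --entity-name bob
```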

For configuring authorization and ACLs, we first create a topic named my-topic.
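The topic can be created with the kafka-topics tool; the partition and replication counts below are illustrative assumptions for a single-broker setup:

```
kafka-topics.sh --zookeeper localhost:2181 --create \
  --topic my-topic --partitions 1 --replication-factor 1
```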

Next, we will set ACLs for my-topic. Alice will get the producer role, while Bob will get the consumer role.
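Using the convenience flags of the kafka-acls tool, the two roles might be granted like this (ZooKeeper address assumed):

```
kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 \
  --add --allow-principal User:alice --producer --topic my-topic

kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 \
  --add --allow-principal User:bob --consumer --topic my-topic \
  --group my-consumer-group
```

Note that the consumer ACL also covers the consumer group, since Bob must be allowed to read from my-consumer-group.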


During bootstrap, Spring will load org.springframework.kafka.core.KafkaAdmin into the application context, which delegates to AdminClient. AdminClient will then try to authenticate and connect to the Kafka server. Of course, because we previously set allow.everyone.if.no.acl.found to true, we can safely skip the authentication for now. But as I mentioned previously, our end goal is to set this property to false. Thus, we need to set up security for KafkaAdmin as well, using JAAS. We will create a JAAS configuration file on our client host, under /var/private/jaas/jaas-spring-client.conf.
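The JAAS file for a SCRAM client typically looks like the following (the password is a placeholder, not a value from this post):

```
KafkaClient {
  org.apache.kafka.common.security.scram.ScramLoginModule required
  username="alice"
  password="alice-secret";
};
```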

Note: a sample JAAS configuration is under the src/main/resources folder. The username depends on which configuration you are working on. In this example, I put in the producer configuration, hence the username is alice.

Next, we set JVM properties during startup.
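Assuming the JAAS file path above and a packaged Spring Boot jar (the jar name is a placeholder), the startup command might look like:

```
java -Djava.security.auth.login.config=/var/private/jaas/jaas-spring-client.conf \
  -jar spring-kafka-sasl-demo.jar
```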


In the producer configuration, we need to insert additional properties for SSL and the SASL mechanism. As a reminder, I set ssl.endpoint.identification.algorithm to an empty string because my certificate does not contain an FQDN. One more thing: I exported kafka.client.truststore.jks in the previous post and put it on the client machine under the /var/private/ssl/ folder.
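As a minimal, framework-free sketch of those properties (the bootstrap address, truststore password, and SCRAM-SHA-256 mechanism are assumptions — adjust them to your setup; in a Spring Boot application the same keys would go into the producer factory configuration):

```java
import java.util.Properties;

// Illustrative producer security settings; broker address, mechanism,
// and passwords are placeholders, not values from the original post.
public class SecureProducerConfig {

    public static Properties buildProducerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9093"); // assumed SASL_SSL listener
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-256"); // or SCRAM-SHA-512
        props.put("ssl.truststore.location",
                "/var/private/ssl/kafka.client.truststore.jks");
        props.put("ssl.truststore.password", "changeit"); // placeholder
        // Empty value disables hostname verification (certificate has no FQDN)
        props.put("ssl.endpoint.identification.algorithm", "");
        // Credentials come from the JAAS file passed via
        // -Djava.security.auth.login.config at startup.
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildProducerProps().getProperty("security.protocol"));
    }
}
```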


The consumer configuration is not much different from the producer configuration.
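The main differences are the deserializers and the consumer group. A comparable sketch (same assumptions and placeholders as the producer example above):

```java
import java.util.Properties;

// Illustrative consumer security settings; addresses and passwords
// are placeholders, not values from the original post.
public class SecureConsumerConfig {

    public static Properties buildConsumerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9093"); // assumed SASL_SSL listener
        props.put("group.id", "my-consumer-group");       // Bob's consumer group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        // Same security block as the producer; the JAAS file on the
        // consumer side would carry Bob's credentials instead of Alice's.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-256"); // or SCRAM-SHA-512
        props.put("ssl.truststore.location",
                "/var/private/ssl/kafka.client.truststore.jks");
        props.put("ssl.truststore.password", "changeit"); // placeholder
        props.put("ssl.endpoint.identification.algorithm", "");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildConsumerProps().getProperty("group.id"));
    }
}
```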

Now you can produce messages using KafkaTemplate and consume messages with @KafkaListener as usual. The producer example is in the KafkaProducer class and the consumer is in the TestConsumer class.
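As a rough sketch of those two classes (the class and topic names come from the post; the method bodies are illustrative, not the repository code):

```
@Component
public class KafkaProducer {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public KafkaProducer(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(String message) {
        kafkaTemplate.send("my-topic", message);
    }
}

@Component
public class TestConsumer {

    @KafkaListener(topics = "my-topic", groupId = "my-consumer-group")
    public void listen(String message) {
        System.out.println("Received: " + message);
    }
}
```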


That’s it for my second article about configuring authentication and authorization in Kafka, covering the ZooKeeper and broker perspectives as well as the client perspective. You can play around by changing the usernames, passwords, and roles to make sure the RBAC is set up as expected. The next article will talk about message encryption.

Meanwhile, the GitHub repository for this sample is available here.


Author: ru rocker

I have been a professional software developer since 2004. Java, Python, NodeJS, and Go are my favorite programming languages. I also have an interest in DevOps. I hold professional certifications: SCJP, SCWCD, PSM 1, AWS Solution Architect Associate, and AWS Solution Architect Professional.
