Kafka roundtrip with Spring Boot



In this blog I’ll show an example of how to run a very simple roundtrip Spring Boot app which sends a message to a Kafka topic, consumes it, and prints the message to the console.


The code for this example can be found on GitHub

Before you start…

Before running the Spring Boot app or the integration test, you should set up your environment and start a local Kafka broker, as detailed here. The broker is started in the background using KRaft instead of ZooKeeper, which is being phased out. There’s a script in the code that allows you to kill the background process once done. Please exercise care when executing this script, as it kills the background process in an uncontrolled way with -9.

What to expect from the example

The example is a Spring Boot web app which includes the Kafka starter. Details of what Spring has to offer when it comes to Kafka can be found here.

The application starts a Servlet Container listening on port 8080 which exposes the following URL: http://localhost:8080/api/v1/kafka/publish

The endpoint accepts a POST request with the following JSON payload, which is handled by the Spring MVC controller:

{
  "firstName": "First Name",
  "lastName": "Last Name"
}

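With the app running, the endpoint can be exercised with curl, for example (the port and path come from the example above):

```shell
# POST the sample payload to the publish endpoint of the locally running app
curl -X POST http://localhost:8080/api/v1/kafka/publish \
  -H "Content-Type: application/json" \
  -d '{"firstName": "First Name", "lastName": "Last Name"}'
```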
This type has been defined in the Spring app as a DTO:

Party class represents the event payload type. Rest of the code omitted for brevity
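A minimal sketch of what such a DTO might look like — the field names match the JSON payload above, while the constructor and accessor shape is an assumption (Jackson needs a no-arg constructor to deserialise the JSON):

```java
// Sketch of the Party DTO the JSON payload maps to; shape is illustrative.
public class Party {

    private String firstName;
    private String lastName;

    // No-arg constructor required by Jackson for JSON deserialisation
    public Party() { }

    public Party(String firstName, String lastName) {
        this.firstName = firstName;
        this.lastName = lastName;
    }

    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }
    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }

    @Override
    public String toString() {
        return "Party{firstName='" + firstName + "', lastName='" + lastName + "'}";
    }
}
```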

The controller is very simple: it receives the POST request, creates a Party DTO and invokes the Kafka producer service to send the payload to the first_topic Kafka topic, returning a 200 (OK) with some description.
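A hedged sketch of such a controller — the endpoint path, topic name and 200 response come from the description above, while the class and service names are illustrative:

```java
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/v1/kafka")
public class KafkaController {

    private final KafkaProducerService producerService;

    public KafkaController(KafkaProducerService producerService) {
        this.producerService = producerService;
    }

    @PostMapping("/publish")
    public ResponseEntity<String> publish(@RequestBody Party party) {
        // Delegate to the producer service, which sends the payload to Kafka
        producerService.send(party);
        return ResponseEntity.ok("Message sent to topic first_topic");
    }
}
```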

Spring Boot Kafka producer
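A sketch of the producer service, assuming a KafkaTemplate auto-configured by Spring Boot (the class name is illustrative; the topic name comes from the example):

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaProducerService {

    private static final String TOPIC = "first_topic";

    private final KafkaTemplate<String, Party> kafkaTemplate;

    public KafkaProducerService(KafkaTemplate<String, Party> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(Party party) {
        // The payload is serialised to JSON by the configured JsonSerializer
        kafkaTemplate.send(TOPIC, party);
    }
}
```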

Once the message has been sent to the topic, it is consumed by the Kafka consumer service, which prints the output to the console:

Spring Boot Kafka consumer
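A sketch of the consumer service, using @KafkaListener to subscribe to the topic (the class name and consumer group id are assumptions):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class KafkaConsumerService {

    @KafkaListener(topics = "first_topic", groupId = "group_id")
    public void consume(Party party) {
        // The payload is deserialised from JSON back into a Party instance
        System.out.println("Consumed message: " + party);
    }
}
```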

Adding Spring Kafka support requires minimal dependencies (here I use Maven, but you can use Gradle or any other build tool):

Adding Kafka support in Spring Boot
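With Maven, the relevant dependencies might look like the following (versions are managed by the Spring Boot parent POM):

```xml
<!-- Web starter for the Spring MVC controller -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- Spring Kafka: KafkaTemplate, @KafkaListener and JSON (de)serialisers -->
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
```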

The spring-kafka dependency gives us the KafkaTemplate class for the producer and the @KafkaListener annotation for the consumer, which, you will agree, makes the job of producing and consuming events really easy.

Upon hitting the local URL mentioned above, the event payload is printed to the console:

Console output

How does Spring know about Kafka?

The Kafka configuration lives in the application configuration file:

Here we define the broker URL, the offset reset policy, and String and JSON serialisers / deserialisers. As you know, Kafka messages are always stored as byte arrays: when we send a typed payload to Kafka we need to serialise it from the source type to binary, and when we consume a message from Kafka we need to deserialise it from binary back to the target type. Since we send JSON as the event payload, the serialisers / deserialisers configured above do exactly that.
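An application.yml for this setup might look like the following sketch — the broker address and group id are assumptions; the String and JSON (de)serialiser classes come from Kafka and Spring Kafka:

```yaml
spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      group-id: group_id
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        # Allow the JsonDeserializer to instantiate the Party DTO
        spring.json.trusted.packages: "*"
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
```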

Doing this without Spring Boot would require many more lines of code.

Running the integration test

The example also comes with an integration test (it requires the local Kafka instance to be set up and running as per the instructions above).

The test simply simulates a POST request to the Spring Boot controller and verifies that the response is what is expected.

The integration test
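A sketch of what such a test might look like, using Spring Boot's MockMvc support to simulate the POST request (the class name is illustrative; the path and payload come from the example above):

```java
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.http.MediaType;
import org.springframework.test.web.servlet.MockMvc;

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

@SpringBootTest
@AutoConfigureMockMvc
class KafkaRoundtripIntegrationTest {

    @Autowired
    private MockMvc mockMvc;

    @Test
    void publishEndpointReturns200() throws Exception {
        // Simulate the POST request to the controller and verify the response
        mockMvc.perform(post("/api/v1/kafka/publish")
                        .contentType(MediaType.APPLICATION_JSON)
                        .content("{\"firstName\":\"First Name\",\"lastName\":\"Last Name\"}"))
                .andExpect(status().isOk());
    }
}
```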

The test makes use of Spring Boot's excellent support for test automation.

Using Conduktor as a Kafka client

Conduktor is a great free UI for managing your Kafka cluster. Download and installation instructions can be found here. If you’re on a Mac with brew installed, you can also install Conduktor with the following commands:

brew tap conduktor/brew
brew install conduktor

I hope you enjoyed this post.

By Marco Tedone

