Overview
In this blog post I’ll walk through a simple round-trip Spring Boot app that sends a message to a Kafka topic, consumes it, and prints the message to the console.
TL;DR
The code for this example can be found on GitHub
Before you start…
Before running the Spring Boot app or the integration test, you should set up your environment and start a local Kafka broker, as detailed here. The broker is started in the background using KRaft instead of ZooKeeper, which is being phased out. There’s a script in the code that lets you kill the background process once you’re done. Please exercise care when executing this script, as it kills the background process in an uncontrolled way with SIGKILL (-9).
What to expect from the example
The example is a Spring Boot web app which includes the Kafka starter. Details of what Spring has to offer when it comes to Kafka can be found here.
The application starts a servlet container listening on port 8080, which exposes the following URL: http://localhost:8080/api/v1/kafka/publish
When hitting the URL, the following JSON payload is sent to the Spring MVC controller:
```json
{
  "firstName": "First Name",
  "lastName": "Last Name"
}
```
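If the app is running locally, you can exercise the endpoint with a request along these lines (the URL and payload come from the example above; the curl invocation itself is just an illustration):

```sh
curl -X POST http://localhost:8080/api/v1/kafka/publish \
  -H "Content-Type: application/json" \
  -d '{"firstName": "First Name", "lastName": "Last Name"}'
```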
This type has been defined in the Spring app as a DTO:
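The actual definition lives in the repository; a rough sketch matching the JSON fields could look like this (the toString implementation is my own assumption, used later for the console output):

```java
// A plain DTO matching the JSON payload above. The no-args constructor
// is needed so the JSON deserialiser can instantiate the object.
public class Party {

    private String firstName;
    private String lastName;

    public Party() {
    }

    public Party(String firstName, String lastName) {
        this.firstName = firstName;
        this.lastName = lastName;
    }

    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }

    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }

    @Override
    public String toString() {
        return "Party{firstName='" + firstName + "', lastName='" + lastName + "'}";
    }
}
```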

The controller is very simple: it receives the POST request, creates a Party DTO, and invokes the Kafka producer service to send the payload to the first_topic Kafka topic, returning a 200 (OK) with a short description.
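Roughly, such a controller could be sketched like this (the KafkaController class name and the producer service’s send method are assumptions; the path and topic come from the example):

```java
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/v1/kafka")
public class KafkaController {

    private final KafkaProducerService producerService;

    public KafkaController(KafkaProducerService producerService) {
        this.producerService = producerService;
    }

    // Binds the JSON payload to a Party DTO and hands it to the producer service.
    @PostMapping("/publish")
    public ResponseEntity<String> publish(@RequestBody Party party) {
        producerService.send(party);
        return ResponseEntity.ok("Message sent to the first_topic topic");
    }
}
```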

Once the message has been sent to the topic, it is consumed by the Kafka consumer service, which prints the output to the console:
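A sketch of such a consumer might look like this (the class name and the my-group consumer group are my assumptions; the topic name comes from the example):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class KafkaConsumerService {

    // Invoked by Spring for every message arriving on first_topic;
    // the JSON deserialiser rebuilds the Party DTO for us.
    @KafkaListener(topics = "first_topic", groupId = "my-group")
    public void consume(Party party) {
        System.out.println("Consumed message: " + party);
    }
}
```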

The dependencies needed for Spring Kafka support are minimal (here I use Maven, but you can use Gradle or any other build tool):
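In pom.xml this boils down to something like the following (I’ve included the web starter too, since the example is a web app):

```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
```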

The spring-kafka dependency gives us the KafkaTemplate class for the producer and the @KafkaListener annotation for the consumer, which make producing and consuming events remarkably easy.
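For illustration, a producer service built on KafkaTemplate could look like this sketch (class and method names are my assumptions):

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaProducerService {

    private static final String TOPIC = "first_topic";

    private final KafkaTemplate<String, Party> kafkaTemplate;

    public KafkaProducerService(KafkaTemplate<String, Party> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(Party party) {
        // KafkaTemplate serialises the payload and dispatches it to the broker.
        kafkaTemplate.send(TOPIC, party);
    }
}
```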
Upon hitting the local URL mentioned above, the event payload is printed to the console:
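With the sketches above, it would look something like this (the exact format depends on the DTO’s toString):

```
Consumed message: Party{firstName='First Name', lastName='Last Name'}
```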

How does Spring know about Kafka?
The Kafka configuration is in the application.properties file:
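The repository holds the actual file; a configuration along these lines would match what the post describes (the broker address and group id are assumptions for a default local setup):

```properties
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=my-group
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.trusted.packages=*
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
```

The trusted packages entry is needed so that the JsonDeserializer is allowed to instantiate the Party DTO.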

Here we define the broker URL, the offset reset, and String and JSON serialisers / deserialisers. Kafka messages are always stored as byte arrays: when we send a payload to Kafka we need to serialise it from the source type to bytes, and when we consume a message we need to deserialise it from bytes back to the target type. Since we send JSON as the event payload, the serialisers / deserialisers configured above take care of this for us.
Doing this without Spring Boot would require many more lines of code.
Running the integration test
The example also comes with an integration test (it requires the local Kafka instance to be set up and running, as per the instructions above).
The test simply simulates a POST request to the Spring Boot controller and verifies that the response is as expected.
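A MockMvc-based test would be one way to do this; the following is a sketch under that assumption (class and method names are illustrative):

```java
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.http.MediaType;
import org.springframework.test.web.servlet.MockMvc;

@SpringBootTest
@AutoConfigureMockMvc
class KafkaControllerIntegrationTest {

    @Autowired
    private MockMvc mockMvc;

    @Test
    void publishReturnsOk() throws Exception {
        // Simulates the POST request and checks for the 200 (OK) response.
        mockMvc.perform(post("/api/v1/kafka/publish")
                        .contentType(MediaType.APPLICATION_JSON)
                        .content("{\"firstName\": \"First Name\", \"lastName\": \"Last Name\"}"))
                .andExpect(status().isOk());
    }
}
```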

The test makes use of the excellent support for automated testing offered by Spring Boot.
Using Conduktor as a Kafka client
Conduktor is a great free UI for managing your Kafka cluster. Download and installation instructions can be found here. If you’re on a Mac and have Homebrew installed, you can also install Conduktor with the following commands:
```sh
brew tap conduktor/brew
brew install conduktor
```
I hope you enjoyed this post.