Apache Kafka in Spring Boot Application
Choosing the right messaging system during your architectural planning is always a
challenge, yet one of the most important considerations to nail. As a developer, I write
applications daily that need to serve lots of users and process huge amounts of data in
real time.
Usually, I use Java with the Spring Framework (Spring Boot, Spring Data, Spring Cloud,
Spring Caching, etc.) for this. Spring Boot is a framework that allows me to move through
my development process much faster and more easily than before, and it has come to play a
crucial role in my organization. As our user base grew quickly, we realized we needed
something that could process as many as 1,000,000 events per second.
When we found Apache Kafka®, we saw that it met our needs and could handle millions
of messages quickly. That's why we decided to try it, and since that moment, Kafka has
been a vital tool in my toolbox. Why did I choose it, you ask? Kafka is:
Scalable
Fault tolerant
A great publish-subscribe messaging system
Capable of higher throughput compared with most messaging systems
Highly durable
Highly reliable
Highly performant
I recommend using the Confluent CLI for your development to get Apache Kafka and the
other components of a streaming platform up and running.
Table of contents
Step 1: Generate our project
Step 2: Publish/read messages from the Kafka topic
Step 3: Configure Kafka through application.yml configuration file
Step 4: Create a producer
Step 5: Create a consumer
Step 6: Create a REST controller
Step 2: Publish/read messages from the Kafka topic
Start by creating a simple Java class, which we will use for our example:

package com.demo.models;

public class User {

    private String name;
    private int age;

    public User(String name, int age) {
        this.name = name;
        this.age = age;
    }
}
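Note that the rest of this guide sends plain strings, so this User class only serves as an example of a payload model. If you did want to publish User objects directly, one option (a sketch of my own, not part of this guide's setup) is to type the template with User and switch the producer's value-serializer to spring-kafka's org.springframework.kafka.support.serializer.JsonSerializer:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Hypothetical variation: assumes value-serializer is set to
// org.springframework.kafka.support.serializer.JsonSerializer in application.yml,
// and that User is imported from com.demo.models.
@Service
public class UserProducer {

    @Autowired
    private KafkaTemplate<String, User> kafkaTemplate;

    public void sendUser(User user) {
        // The User instance is serialized to JSON on the wire.
        this.kafkaTemplate.send("users", user);
    }
}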
Step 3: Configure Kafka through application.yml configuration file
Next, we need to create the configuration file. We need to configure our Kafka
producer and consumer to be able to publish and read messages to and from the topic.
Instead of creating a Java class and marking it with the @Configuration annotation, we can
use either the application.properties file or application.yml. Spring Boot allows us to avoid
the boilerplate code we used to write in the past and provides us with a much more
intelligent way of configuring our application, like this:
server:
  port: 9000
spring:
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      group-id: group_id
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    producer:
      bootstrap-servers: localhost:9092
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
If you want to learn more about Spring Boot auto-configuration, you can read this short
and useful article. For a full list of available configuration properties, you can refer to the
official documentation.
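For comparison, here is roughly what the consumer half of that YAML would look like as explicit Java configuration, i.e., the boilerplate Spring Boot spares us. This is a sketch under my own naming (KafkaConsumerConfig is not part of this guide's code):

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

// A hand-written equivalent of the spring.kafka.consumer section above.
@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "group_id");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        // Wires the consumer factory into the listener infrastructure that
        // @KafkaListener methods run on.
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}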
Step 4: Create a producer

@Service
public class Producer {

    private static final String TOPIC = "users";

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void sendMessage(String message) {
        this.kafkaTemplate.send(TOPIC, message);
    }
}
We just auto-wired KafkaTemplate and will use this instance to publish messages to the
topic. That's it for the producer!
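If you want confirmation that a message actually reached the broker, send() returns a future you can hook into. Here is a minimal sketch of an extra method for the Producer class above (in spring-kafka 3.x the return type is CompletableFuture; in 2.x it is a ListenableFuture, so the callback wiring differs slightly):

// Sketch: assumes an SLF4J `logger` field on the Producer class.
public void sendMessageWithCallback(String message) {
    // In spring-kafka 3.x, send() returns CompletableFuture<SendResult<String, String>>.
    this.kafkaTemplate.send(TOPIC, message).whenComplete((result, ex) -> {
        if (ex != null) {
            logger.error("Failed to publish message: {}", message, ex);
        } else {
            logger.info("Published to partition {} at offset {}",
                    result.getRecordMetadata().partition(),
                    result.getRecordMetadata().offset());
        }
    });
}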
Step 5: Create a consumer

@Service
public class Consumer {

    private final Logger logger = LoggerFactory.getLogger(Consumer.class);

    @KafkaListener(topics = "users", groupId = "group_id")
    public void consume(String message) {
        logger.info("Consumed message: {}", message);
    }
}
Here, we told our method void consume(String message) to subscribe to the users
topic and just emit every message to the application log. In your real application, you
can handle messages the way your business requires.
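If you need more than the payload, @KafkaListener methods can also accept the full ConsumerRecord. A small variation of my own, not part of this guide's code, that would slot into the Consumer class above:

// Sketch: receive the whole record to access the key, partition, and offset.
// Assumes org.apache.kafka.clients.consumer.ConsumerRecord is imported and
// reuses the `logger` field from the Consumer class.
@KafkaListener(topics = "users", groupId = "group_id")
public void consumeRecord(ConsumerRecord<String, String> record) {
    logger.info("key={} partition={} offset={} value={}",
            record.key(), record.partition(), record.offset(), record.value());
}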
Step 6: Create a REST controller
To fully show how everything we created works, we need to create a controller with a
single endpoint. We will send the message to this endpoint, our producer will publish it
to the topic, and then our consumer will catch and handle it the way we set it up, by
logging it to the console.
@RestController
@RequestMapping(value = "/kafka")
public class KafkaController {

    private final Producer producer;

    @Autowired
    KafkaController(Producer producer) {
        this.producer = producer;
    }

    @PostMapping(value = "/publish")
    public void sendMessageToKafkaTopic(@RequestParam("message") String message) {
        this.producer.sendMessage(message);
    }
}
Now run the application and send a test message to the endpoint with cURL:

curl -X POST -F 'message=test' http://localhost:9000/kafka/publish
Basically, that's it! In fewer than 10 steps, you learned how easy it is to add Apache
Kafka to your Spring Boot project, and you are ready to go with this super tool!
Learn more
To learn more about using Spring Boot with Apache Kafka, check out this free
course with expert videos and guides.
You can also sign up for Confluent Cloud, a fully managed event streaming platform
powered by Apache Kafka, and use the promo code SPRING200 for an additional $200
of free Confluent Cloud usage.*