Introduction
Apache Kafka is one of the most popular event streaming platforms in the world. It is open source, with commercial distributions also available, making it a versatile choice for large-scale message streaming and real-time data processing. Kafka is based on the Pub/Sub (Publisher/Subscriber) model, which enables the seamless exchange of messages between producers and consumers.
In this guide, we’ll walk you through Kafka installation and configuration, then cover Kafka topic creation, producer-consumer setup, and integration with Spring Boot.
1. Download Apache Kafka
- To get started with Kafka, visit the Apache Kafka official website and click the "DOWNLOAD KAFKA" button located in the top navigation bar.
- You'll be redirected to the Apache Kafka download page. Once there, select the latest stable version of Kafka (we recommend using the most recent version) and download the binary release.
2. Install and Configure Kafka
- Extract (unzip) the Kafka archive you downloaded.
- Move the extracted folder to the C: drive and rename it to kafka (for example: C:\kafka). Important: avoid placing the Kafka directory deep inside nested folders, since long paths can trigger errors such as “The input line is too long” when starting Kafka.
Before getting started with Kafka, a few configuration changes are required in the property files below.
- Configure zookeeper.properties:
  - Navigate to C:\kafka\config and open the zookeeper.properties file.
  - Find the dataDir property and update it to the following: dataDir=C:/tmp/zookeeper
- Configure server.properties:
  - Similarly, open the server.properties file in the C:\kafka\config folder.
  - Locate the log.dirs property and change it to: log.dirs=C:/tmp/kafka-logs
3. Start the Kafka Server
Once Kafka is configured, it’s time to start the services. Please visit the Apache Kafka QuickStart for more information.
Step 1: Start Zookeeper Service:
Open a command prompt in the C:\kafka directory and run the following command:
Command: .\bin\windows\zookeeper-server-start.bat .\config\zookeeper.properties
Step 2: Start Kafka Broker Service:
In a separate terminal window, still within the C:\kafka directory, execute the following command to start the Kafka broker:
Command: .\bin\windows\kafka-server-start.bat .\config\server.properties
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMLcE130BQL21NTrojPYnbIXYGWPfTvb8JCk835Zwwi86jmRHNKw7srojfmUb4Zn6eXvzlmVxnMP15PxBcRa1wy9fjJh1HKpWeEqglIJYqzr3xk_GfvGlMpH9dnXykoaFXguVEuh18u6uHSSi7UYA2JxEXAxySL2ulYvlXm2zMT-3QKefOqjCvy3v0gP6g/w640-h350-rw/start-kafka-broker-service.webp)
Once the services are successfully started, Kafka will be running on port 9092.
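Once the broker is up, a quick TCP socket check can confirm that something is listening on port 9092. This is a minimal sketch in plain Java; the `isPortOpen` helper name is our own, not part of any Kafka API:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class BrokerCheck {
    // Returns true if a TCP connection to host:port succeeds within the timeout.
    static boolean isPortOpen(String host, int port, int timeoutMillis) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMillis);
            return true;
        } catch (IOException e) {
            return false; // broker not reachable (not running yet, or port blocked)
        }
    }

    public static void main(String[] args) {
        boolean up = isPortOpen("localhost", 9092, 2000);
        System.out.println(up ? "Kafka broker is reachable on port 9092"
                              : "Kafka broker is NOT reachable on port 9092");
    }
}
```

Note that this only verifies the port is open; it does not speak the Kafka protocol, so it is just a first sanity check.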
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjf-k1YgbF8nt8taf567kqnrirGe9it8lFupWkx4ATweo-M0VmnAVlQT8CUocb4HvmIeu4YtuyXTnhIOXQqdFdgKaS55lJaRPD7czp1BM26n-_GqKoVDAdGNdM3RxACNGuW1BaPw-Ec39raVdmODx8zt5ljO_Y_lxxeG04f9AXv6C4b6DbybyqP47eYy9K6/w640-h26-rw/kafka-port.webp)
After executing the commands above, you'll notice that, per our earlier configuration, a new tmp folder is created on the C: drive, and two new folders are generated inside it to hold the Zookeeper data and the Kafka logs.
4. Create a Kafka Topic
Step 1: Create a Simple Topic:
Open a new terminal window and run the following command to create a new Kafka topic:
Command: .\bin\windows\kafka-topics.bat --create --topic myFirstTopic --bootstrap-server localhost:9092
Step 2: Create a Topic with Custom Partitions:
For more advanced configurations, you can specify the number of partitions and replication factor:
Command: .\bin\windows\kafka-topics.bat --create --topic partitioned-kafka-topic --bootstrap-server localhost:9092 --partitions 3 --replication-factor 1
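With multiple partitions, Kafka's default partitioner routes keyed messages by hashing the key modulo the partition count, so all messages with the same key land on the same partition. The sketch below only illustrates the idea with Java's built-in `hashCode`; real Kafka clients use a murmur2 hash, so actual partition numbers will differ:

```java
public class PartitionDemo {
    // Simplified illustration of keyed partitioning. Kafka's default
    // partitioner uses murmur2 rather than hashCode, but the principle is
    // the same: the partition is derived from the key, modulo the count.
    static int partitionFor(String key, int numPartitions) {
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int partitions = 3; // matches --partitions 3 in the command above
        for (String key : new String[] {"order-1", "order-2", "order-1"}) {
            System.out.println(key + " -> partition " + partitionFor(key, partitions));
        }
    }
}
```

Because the mapping is deterministic, the two "order-1" messages always go to the same partition, which is how Kafka preserves per-key ordering.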
5. Create Kafka Producers and Consumers
- Kafka uses the Pub/Sub (Publisher/Subscriber) pattern to handle events and message streaming.
- To add messages to a Kafka topic, we create a Producer (Kafka's term for a publisher), which produces the messages, and a Consumer (Kafka's term for a subscriber), which consumes them by listening to that Kafka topic.
5.1 Create a Kafka Producer
To produce messages to the Kafka topic, run the following command:
Command: .\bin\windows\kafka-console-producer.bat --topic myFirstTopic --bootstrap-server localhost:9092
Now, you can enter messages, and they will be sent to the myFirstTopic Kafka topic.
5.2 Create a Kafka Consumer
To listen for messages produced to the Kafka topic, open another terminal and run:
Command: .\bin\windows\kafka-console-consumer.bat --topic myFirstTopic --from-beginning --bootstrap-server localhost:9092
After creating both the Producer and the Consumer, try sending some messages from the Producer to verify that the Consumer receives them through the Kafka topic, demonstrating real-time streaming.
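The produce-then-consume flow above can be mimicked in plain Java with a blocking queue. This is only an in-memory analogy of the Pub/Sub pattern, not real Kafka client code; the queue stands in for the topic:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class PubSubAnalogy {
    // Publishes the given messages onto a queue (our stand-in "topic") from a
    // producer thread, then drains them as a consumer and returns them in order.
    static List<String> runPipeline(String[] messages) throws InterruptedException {
        BlockingQueue<String> topic = new LinkedBlockingQueue<>();
        Thread producer = new Thread(() -> {
            for (String msg : messages) {
                topic.offer(msg); // "produce" to the topic
            }
        });
        producer.start();
        producer.join();
        List<String> received = new ArrayList<>();
        while (!topic.isEmpty()) {
            received.add(topic.take()); // "consume" from the topic
        }
        return received;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runPipeline(new String[] {"hello", "from", "kafka"}));
        // prints [hello, from, kafka]
    }
}
```

As with the console producer and consumer, messages come out in the order they were produced; real Kafka guarantees this ordering only within a single partition.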
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg8tN2TpAKhJLiq0pJojnC38Ac1vXOheWJdGBQcHHwPYBwhmQhKoytOakmGpsGM1C3yo8-sCOVHR-3tN0BxPCW6rNwo-aqlEctN_QMcroz5HdLev7Ek_kYXrPUH1Bv4Fhv4d20Z9_DZYdHnr3-4WuKhr7vlwjoBCLIWScHpdNvkC-bUzyEKg4KIDnpodU4D/w640-h30-rw/kafka-topic-new.webp)
6. Monitor Kafka Topics and Partitions
To better visualize and monitor your Kafka topics and partitions, you can use a Kafka management tool. This allows you to manage topics, consumers, and brokers through an intuitive graphical interface.
Download Kafka Tool:
Visit the following link to download Kafka Tool: Kafka Tool Download
This GUI tool will help you interact with your Kafka instance and manage your topics, partitions, and messages with ease.
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhBJw9iklYXsl_hhZnk2SylQpPdn-jTmztFgQnSA-GLFsd1RqikPO7ZQ7Y7cu5oKXNMUAB9TucCUTQEQKNmglpnGofw74yg_eSu2IIjXloNxMwVCzTUUcneOafWrL9JdWbqLSzN9Z84zSEA8iYcF6VxNW05RuxVHZaMmNb-s5JtsBak1r2poLgp7THz9PIg/w640-h334-rw/monitor-kafka-topics-partitions.webp)
7. Kafka Integration with a Sample Spring Boot Project
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQsf0sCHDrW-u92wFQ-7MPzwpcir3_DSGvGpi2FYmGtN4yJhY7vC0JI6huatD2o6JLBGyivAbBhmmjO-UHtkBCr50opSfa7ZsNVsSYO8wDXxMN21rBfQehU8jnslrVXQsHCctZJEZZ2_bE0bnN00PDbx9TN0H7n2LK2GPdNjX0a5zUBESPQYf0Y6EvJMp2/w640-h314-rw/kafka-integration-spring-boot.webp)
Spring Boot Kafka Producer and Consumer
For a complete example of Kafka integration in a Spring Boot project, you can refer to these GitHub repositories:
- Kafka Producer Example: Spring Boot Kafka Producer
- Kafka Consumer Example: Spring Boot Kafka Consumer
These repositories provide ready-to-use examples of both Kafka producers and consumers within a Spring Boot application.
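As a starting point, Spring Boot's Kafka support (the spring-kafka dependency) can typically be wired up with a few application.properties entries like the following; the group id is an illustrative name, and the broker address matches the local setup above:

```properties
# Broker address (matches the local setup above)
spring.kafka.bootstrap-servers=localhost:9092

# Consumer settings (group id is an illustrative name)
spring.kafka.consumer.group-id=my-consumer-group
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer

# Producer settings
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
```

With this configuration in place, Spring Boot auto-configures a KafkaTemplate for producing and lets @KafkaListener-annotated methods consume from topics, as shown in the linked repositories.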