Working with Message Streams using Apache Kafka

 

Introduction

Apache Kafka is one of the most popular event and message streaming platforms in the world, available in both open-source and commercial editions.
It uses the Publish/Subscribe (Pub/Sub) pattern to handle event and message streaming.


1. Download Kafka

  1. Please go to the official Apache Kafka website and click the DOWNLOAD KAFKA button in the top navigation bar.



  2. You'll be redirected to the Apache Kafka download page. Choose the latest stable version (recommended) and download its binary release.

          


                       

2. Install/Configure Kafka

  1. Unzip the downloaded archive, move the extracted folder to the C: drive, and rename it to kafka. (Note: don't nest the folder deep inside the C: drive; a long path can cause an error such as "The input line is too long" when starting Kafka.)






  2. Before getting started with Kafka, we need to make some small configuration changes in the property files below.
    1. Configure zookeeper.properties
      1. Go to the C:\kafka\config folder, open the zookeeper.properties file, find the dataDir property, and set its value to C:/tmp/zookeeper.
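After the change, the relevant line in zookeeper.properties should look like this (note the forward slashes, which Kafka accepts on Windows):

```properties
# directory where ZooKeeper stores its snapshot data
dataDir=C:/tmp/zookeeper
```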




    2. Configure server.properties
      1. Go to the C:\kafka\config folder, open the server.properties file, find the log.dirs property, and set its value to C:/tmp/kafka-logs.
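After the change, the relevant line in server.properties should look like this:

```properties
# directory (or comma-separated list of directories) where Kafka stores its log segments
log.dirs=C:/tmp/kafka-logs
```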




3. Start the Kafka Server

  Please refer to the Apache Kafka Quickstart page for the steps below.

  1. First, go to the C:\kafka folder (as in the screenshot below) and execute the following command to start the ZooKeeper service.

    Command: .\bin\windows\zookeeper-server-start.bat .\config\zookeeper.properties

        



  2. Then open another terminal at the same location and, keeping the previous terminal open, execute the following command to start the Kafka broker service.

       Command: .\bin\windows\kafka-server-start.bat .\config\server.properties

      




  At the end of the startup log, you should see that the Kafka server has started successfully on port 9092.
     


  3. After executing the commands above, you should see that, as per our configuration, a new tmp folder has been created on the C: drive, containing two new folders that hold the logs.

      



4. Create a Kafka Topic

  1. Open a new terminal and run the following command to create a new Kafka topic.

            Command: .\bin\windows\kafka-topics.bat --create --topic myFirstTopic --bootstrap-server localhost:9092
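To confirm the topic was created, you can list all topics on the broker with the --list flag (this assumes the broker from the previous section is still running on localhost:9092):

```shell
.\bin\windows\kafka-topics.bat --list --bootstrap-server localhost:9092
```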

             


  

     
  2. For additional options, such as multiple partitions and a replication factor, use the following command.

         Command:  .\bin\windows\kafka-topics.bat --create --topic partitioned-kafka-topic --bootstrap-server localhost:9092 --partitions 3 --replication-factor 1
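The --describe flag shows a topic's partition layout and replica assignments, which is a handy way to verify the options above (again assuming the broker is running locally):

```shell
.\bin\windows\kafka-topics.bat --describe --topic partitioned-kafka-topic --bootstrap-server localhost:9092
```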

              



5. Creating Producers and Consumers

  • Kafka uses the Publish/Subscribe (Pub/Sub) pattern to handle event and message streaming.
  • To add messages to a Kafka topic we create a Producer (Kafka's term for a publisher), which produces the messages, and a Consumer (Kafka's term for a subscriber), which consumes them by listening to that Kafka topic.


5.1 Create a Producer

Execute the following command against the Kafka topic created above to start a new Kafka Producer that publishes messages to the topic.

Command: .\bin\windows\kafka-console-producer.bat --topic myFirstTopic --bootstrap-server localhost:9092


5.2 Create a Consumer

Execute the following command against the Kafka topic created above to start a new Kafka Consumer that listens for the messages published by the producer.

Command: .\bin\windows\kafka-console-consumer.bat --topic myFirstTopic --from-beginning --bootstrap-server localhost:9092


After starting both the Producer and the Consumer, try sending some messages to the Kafka topic from the Producer to verify that the Consumer receives them through the topic in real time.
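As an illustration, a session might look like this (the `>` prompt comes from the console producer; the message text is just an example):

```
# Producer terminal
> Hello Kafka
> This is my first message

# Consumer terminal (started with --from-beginning)
Hello Kafka
This is my first message
```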



6. Monitor Kafka Topics/Partitions

Please download the tool below for a better visual representation of Kafka topics, partitions, and so on through a GUI.

Tool URL: https://www.kafkatool.com/download.html





7. Kafka integration with a Sample Spring Boot Project


           



Please find below a sample Spring Boot project as an example of Kafka integration, along with its Postman collection.

GitHub Link for Kafka Producer: https://github.com/punsaraprathibha/kafka-producer-springboot
GitHub Link for Kafka Consumer: https://github.com/punsaraprathibha/kafka-consumer-springboot
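As a rough sketch of the shape such projects take (the class, topic, and group names here are illustrative assumptions, not taken from the linked repositories), a Spring Boot producer typically wraps KafkaTemplate and a consumer uses @KafkaListener, both provided by the spring-kafka dependency:

```java
// Illustrative sketch only -- requires the spring-kafka dependency and a
// running broker; names below are assumptions, not from the linked repos.
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class MessageProducer {
    private final KafkaTemplate<String, String> kafkaTemplate;

    public MessageProducer(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // publishes a message to the topic created earlier in this guide
    public void send(String message) {
        kafkaTemplate.send("myFirstTopic", message);
    }
}

@Service
class MessageConsumer {
    // invoked by Spring for every record arriving on the topic
    @KafkaListener(topics = "myFirstTopic", groupId = "demo-group")
    public void listen(String message) {
        System.out.println("Received: " + message);
    }
}
```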

