
Kafka Producer JAVA code

In the last post, What is Apache Kafka, we discussed Kafka Producers, Topics and Consumers.

In this post, I am going to provide Java code for writing a Kafka Producer and explain how it works.

In this code, we are going to read a CSV file and send it row by row to a Kafka Topic, for further consumption by a Kafka Consumer. In this way, we can generate a continuous stream of data.

First, you need to create a Kafka Topic. You can either do it from the console or programmatically; a sketch of the programmatic option follows below.

Please go through the basic Quickstart on the Apache Kafka website to create a Topic, send a basic message from a producer and consume it with a consumer.
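If you want to create the Topic programmatically instead of from the console, here is a minimal sketch using Kafka's AdminClient. The broker address, topic name, partition count and replication factor are placeholders, not values from this post.

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.Collections;
import java.util.Properties;

public class TopicCreator {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // placeholder broker address

        try (AdminClient admin = AdminClient.create(props)) {
            // Topic name, partition count and replication factor are illustrative.
            NewTopic topic = new NewTopic("csv-topic", 1, (short) 1);
            admin.createTopics(Collections.singleton(topic)).all().get();
        }
    }
}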

Once you have created the Topic, it's time to write the Kafka Producer, which will send data to the Topic.

To create the Producer, we need to configure a few parameters.

Let's look at the configurations required for creating a producer:

1. List of Kafka Brokers (bootstrap servers)
2. Serializers used for the message key and value
3. Acknowledgement level, so Kafka confirms that messages are properly received

In Java, we will use a Properties object to build this configuration, as shown in the sketch below.

We have created a KafkaProducer with this configuration to send data from the producer.
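A minimal sketch of such a producer, assuming string keys and values and a broker list passed in by the caller (the class name CsvKafkaProducer is illustrative):

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class CsvKafkaProducer {

    // Builds a KafkaProducer from the three configurations listed above:
    // the broker list, the serializers and the acknowledgement level.
    public static KafkaProducer<String, String> createProducer(String brokers) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, brokers);                                // 1. list of Kafka Brokers
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());    // 2. serializer for the key
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());  // 2. serializer for the value
        props.put(ProducerConfig.ACKS_CONFIG, "all");                                               // 3. wait for acknowledgement
        return new KafkaProducer<>(props);
    }
}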



After that, we will open the CSV file to be read using the BufferedReader class and read the header line, which is used to initialize the CSV parser.
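A minimal sketch of that step, assuming the Apache Commons CSV library is used as the parser (the class and method names are illustrative):

import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVParser;

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class CsvReaderSetup {

    // Opens the CSV file with a BufferedReader, reads the first line to get the
    // column headers, and hands the rest of the file to the CSV parser.
    public static CSVParser openCsv(String csvPath) throws IOException {
        BufferedReader reader = new BufferedReader(new FileReader(csvPath));
        String headerLine = reader.readLine();          // the header row of the CSV
        String[] headers = headerLine.split(",");
        return CSVFormat.DEFAULT.withHeader(headers).parse(reader);
    }
}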





Once we have done that, it's time to send the CSV data to the Kafka Topic on the Kafka Broker. We are going to parse the CSV and send it row by row from the Kafka Producer to the Kafka Topic.
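A minimal sketch of that loop, again assuming Apache Commons CSV and the string producer configured above (using the CSV record number as the message key is an arbitrary choice for illustration):

import org.apache.commons.csv.CSVParser;
import org.apache.commons.csv.CSVRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.io.IOException;

public class CsvSender {

    // Iterates over the parsed CSV and sends each row as one message to the Topic.
    public static void sendRows(KafkaProducer<String, String> producer,
                                CSVParser parser,
                                String topic) throws IOException {
        for (CSVRecord record : parser) {
            String key = Long.toString(record.getRecordNumber());  // row number as the message key
            String value = String.join(",", record);               // the row, rebuilt as comma-separated text
            producer.send(new ProducerRecord<>(topic, key, value));
        }
        producer.flush();   // make sure every row has actually reached the broker
        parser.close();
    }
}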




And finally, the main class to initialize the configuration and send the CSV file.
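A minimal sketch of such a main class, wiring together the illustrative helpers from the sketches above (the broker address, Topic name and file path are placeholders):

import org.apache.commons.csv.CSVParser;
import org.apache.kafka.clients.producer.KafkaProducer;

public class CsvProducerMain {

    public static void main(String[] args) throws Exception {
        // Broker address, Topic name and file path are placeholders; replace them with your own.
        String brokers = "localhost:9092";
        String topic = "csv-topic";
        String csvPath = "data.csv";

        KafkaProducer<String, String> producer = CsvKafkaProducer.createProducer(brokers);
        CSVParser parser = CsvReaderSetup.openCsv(csvPath);
        CsvSender.sendRows(producer, parser, topic);
        producer.close();
    }
}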




In the next post, I will provide Java code for consuming the above CSV data from the Topic using a Kafka Consumer.

Also, check out some more posts related to Big Data, Java and Python.






