How to Connect to Kafka Using Java
Last Updated: 02 Jul, 2024
Apache Kafka is an open-source, real-time data handling platform. It is a distributed event store designed for low latency and high volumes of data. Kafka works as a message queue and can be used by microservices to communicate and pass messages. The Kafka architecture has four major components:
- Kafka Brokers: A server in a Kafka cluster is called a broker. Each broker receives messages from producers and serves them to the consumers that subscribe to them.
- Kafka ZooKeeper: ZooKeeper coordinates the Kafka cluster and its brokers. It notifies brokers when something changes in the cluster and vice versa.
- Producer: The producer's job is to produce messages, i.e. send them to a broker.
- Consumer: The consumer's job is to consume messages from the cluster.
Using Kafka we can build high-throughput, fault-tolerant microservices. To connect Kafka with Java we need two things: a producer and a consumer. The producer produces messages, or high volumes of data, to send to another microservice or to a database. But before that, Kafka should be installed on our system.
Downloading Kafka
Before connecting Kafka with Java we need to get Kafka onto our system. There are two ways to do this:
- Download the Kafka server and ZooKeeper.
- Use the Docker image.
For downloading, you can refer to the installation guides for Windows and for Ubuntu.
Installing Kafka with Docker
To work with Docker, first check whether Docker is installed. You can check the Docker version with the command below.
docker --version
If Docker is not installed, download it first. Once Docker is installed, let's start the procedure to install Kafka.
Step 1: Pull the Image
First we will pull the Kafka image from Docker Hub using the command below.
docker pull apache/kafka:3.7.0
Check for the latest version and pull it accordingly. Once the Kafka image is pulled, you can list your local images with docker images to verify that it was downloaded successfully.
Running Kafka with Docker
Now that everything is installed, we are ready to run Kafka. Again, we have two ways to start the Kafka server.
1. Run kafka with Docker Desktop
If you have Docker Desktop, you can go to the Images section and run the Kafka image from there.
2. Run Kafka with Command
Alternatively, we can run Kafka using the command below.
docker run -p 9092:9092 apache/kafka:3.7.0
Kafka is now running and listening on port 9092.
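Before moving on, it can help to verify from Java that the broker is actually reachable. Below is a minimal sketch using the AdminClient from the kafka-clients library; the class name KafkaConnectionCheck is a placeholder of ours, and the topic name "Messages" matches the one used in the producer and consumer examples later in this article.
Java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import java.util.Collections;
import java.util.Properties;
import java.util.Set;

public class KafkaConnectionCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // AdminClient implements AutoCloseable, so try-with-resources closes it for us
        try (AdminClient admin = AdminClient.create(props)) {
            // Create the "Messages" topic (1 partition, replication factor 1 is enough
            // for a single local broker). If the topic already exists, the future fails
            // with an ExecutionException, which we simply report and continue.
            try {
                admin.createTopics(Collections.singletonList(new NewTopic("Messages", 1, (short) 1))).all().get();
                System.out.println("Topic 'Messages' created.");
            } catch (Exception e) {
                System.out.println("Topic creation skipped: " + e.getMessage());
            }
            // List all topics to confirm the broker is reachable
            Set<String> topics = admin.listTopics().names().get();
            System.out.println("Topics on broker: " + topics);
        } catch (Exception e) {
            System.err.println("Could not connect to Kafka: " + e.getMessage());
        }
    }
}
If this prints the topic list, the broker from the Docker container is reachable on localhost:9092 and we can proceed.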
Error Handling and Logging
While connecting to the server or receiving messages we can run into exceptions, and in a real application we cannot go without exception handling. For better insight after an exception occurs we will also implement logging in our application, and we will manage the dependency for logging below.
In Kafka we can face many exceptions related to connectivity, serialization, or deserialization, as well as exceptions while sending messages or on network errors. Let's look at the most common exception classes (a small handling sketch follows below):
- SerializationException: Errors in serializing or deserializing messages. This generally occurs when the serializer configuration is mismatched.
- TimeoutException/NetworkException: These occur on the producer side when a message cannot be sent to the Kafka server, typically because of a timeout or a network issue.
- CommitFailedException: Errors during offset commit in consumers.
For logging we are going to use the SLF4J logger; to log a message we only need two lines:
private static final Logger logger = LoggerFactory.getLogger(KafkaExample.class);
logger.error(message, exception);
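As a concrete illustration, here is a minimal sketch combining the pieces above: it sends one record synchronously and logs the specific Kafka exceptions listed earlier. The class name KafkaExample matches the logger declaration above, and the topic name "Messages" matches the examples later in this article.
Java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.errors.TimeoutException;
import org.apache.kafka.common.serialization.StringSerializer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Properties;
import java.util.concurrent.ExecutionException;

public class KafkaExample {
    private static final Logger logger = LoggerFactory.getLogger(KafkaExample.class);

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // get() blocks until the broker acknowledges the send, so broker-side
            // errors surface here wrapped in an ExecutionException
            producer.send(new ProducerRecord<>("Messages", "key", "Hello, Kafka!")).get();
        } catch (SerializationException e) {
            logger.error("Serialization failed - check the serializer configuration", e);
        } catch (ExecutionException e) {
            if (e.getCause() instanceof TimeoutException) {
                logger.error("Send timed out - is the broker reachable?", e.getCause());
            } else {
                logger.error("Send failed", e.getCause());
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            logger.error("Interrupted while sending", e);
        }
    }
}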
Using this pattern we can log our exceptions. Now let's create an application and connect Java with Kafka.
Create Producer/Consumer Classes in Java with Dependencies
To create the Java application we will follow these steps.
Step 1: Create a Java Application
We will create a Spring Boot Application with Maven.
Step 2: Add Dependencies
In pom.xml we will add the dependencies for the Kafka client and the SLF4J logger.
XML
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>Kafka</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>17</maven.compiler.source>
        <maven.compiler.target>17</maven.compiler.target>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>3.7.0</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>2.0.13</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>2.0.13</version>
        </dependency>
    </dependencies>
</project>
Step 3: Create a class For Producer
Now we create a class in which we set some properties such as the server address and the key and value types, and use it to send a simple message to the server. With a producer we can send messages or data to another service, or post data into a database; Kafka gives us the ability to deliver the data whenever the other entity becomes available.
Java
import org.apache.kafka.clients.producer.*;
import org.apache.kafka.common.serialization.StringSerializer;
import java.util.Properties;

public class Main {
    public static void main(String[] args) {
        // Set up the producer properties
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        Producer<String, String> producer = null;
        try {
            // Create the producer
            producer = new KafkaProducer<>(props);

            // Create a producer record
            ProducerRecord<String, String> record = new ProducerRecord<>("Messages", "key", "Hello, Kafka!");

            // Send the record with a callback to handle exceptions
            producer.send(record, new Callback() {
                @Override
                public void onCompletion(RecordMetadata metadata, Exception exception) {
                    if (exception != null) {
                        System.err.println("Error sending record: " + exception.getMessage());
                        exception.printStackTrace();
                    } else {
                        System.out.println("Record sent successfully to topic " + metadata.topic() +
                                " partition " + metadata.partition() + " at offset " + metadata.offset());
                    }
                }
            });
        } catch (Exception e) {
            System.err.println("Error creating or sending producer: " + e.getMessage());
            e.printStackTrace();
        } finally {
            // Close the producer
            if (producer != null) {
                try {
                    producer.close();
                } catch (Exception e) {
                    System.err.println("Error closing producer: " + e.getMessage());
                    e.printStackTrace();
                }
            }
        }
    }
}
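A side note on the structure above: KafkaProducer implements AutoCloseable, so instead of the explicit finally block a try-with-resources statement can close the producer automatically, and the Callback interface can be written as a lambda. A compact sketch of the same send (assuming props is configured as above):
Java
// Equivalent send using try-with-resources and a lambda callback
try (Producer<String, String> producer = new KafkaProducer<>(props)) {
    producer.send(new ProducerRecord<>("Messages", "key", "Hello, Kafka!"),
            (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.println("Sent to " + metadata.topic() + " at offset " + metadata.offset());
                }
            });
} // close() is called automatically here, flushing any buffered records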
Step 4: Create a class For Consumer
Now we create a class in which we set some properties such as the server address and the key and value types; for the consumer we also specify a group ID. The consumer can be a microservice, or an application that only handles the database. In the code below we create a loop that runs indefinitely. In a real application we would instead configure a listener in Java that watches the server and calls a method whenever a message arrives (a sketch of this appears after the output section).
Java
import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.common.serialization.StringDeserializer;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class Main {
    public static void main(String[] args) {
        // Set up the consumer properties
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group-id");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        Consumer<String, String> consumer = null;
        try {
            // Create the consumer
            consumer = new KafkaConsumer<>(props);

            // Subscribe to the topic
            consumer.subscribe(Collections.singletonList("Messages"));

            // Continuously poll for new messages
            while (true) {
                try {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("offset = %d, key = %s, value = %s%n", record.offset(), record.key(), record.value());
                    }
                } catch (Exception e) {
                    System.err.println("Error during poll: " + e.getMessage());
                    e.printStackTrace();
                }
            }
        } catch (Exception e) {
            System.err.println("Error creating or using consumer: " + e.getMessage());
            e.printStackTrace();
        } finally {
            // Close the consumer
            if (consumer != null) {
                try {
                    consumer.close();
                } catch (Exception e) {
                    System.err.println("Error closing consumer: " + e.getMessage());
                    e.printStackTrace();
                }
            }
        }
    }
}
Output
With the broker running and both programs started, the consumer prints each received record; for the message sent above the console shows something like: offset = 0, key = key, value = Hello, Kafka!
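As mentioned earlier, in a real application we usually let a framework run the poll loop for us. Below is a minimal, hypothetical sketch of that listener style using Spring Kafka; it assumes the spring-kafka dependency has been added to pom.xml and the broker address configured in application.properties, neither of which is shown in this article.
Java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class MessageListener {

    // Spring Kafka runs the poll loop internally and invokes this method
    // for every record that arrives on the "Messages" topic
    @KafkaListener(topics = "Messages", groupId = "my-group-id")
    public void onMessage(String value) {
        System.out.println("Received: " + value);
    }
}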
Conclusion
So, this is how we can connect Java with Kafka. Using Kafka we can send different messages on different topics between many microservices. The code here is just a generic connection example; for more, you can see the dedicated articles on how to create a consumer and how to create a producer.