
How to Connect to Kafka Using Java

Last Updated : 02 Jul, 2024

Apache Kafka is an open-source, real-time data-handling platform. It is a distributed event store built for low latency and high data volume. Kafka can serve as a message queue between microservices for communication and message passing. The Kafka architecture has four major components:

  • Kafka Brokers: A server in a Kafka cluster is called a broker. Each broker receives messages from producers, stores them, and serves them to the consumers that subscribe to them.
  • Kafka ZooKeeper: ZooKeeper coordinates the Kafka cluster and its brokers, notifying brokers when the cluster state changes and vice versa. (Kafka 3.3+ can instead run in KRaft mode without ZooKeeper, which is how the Docker image used below runs.)
  • Producer: The producer's job is to produce messages, i.e., to send them to a broker.
  • Consumer: The consumer's job is to consume messages from the cluster.

Using Kafka we can build high-throughput, fault-tolerant microservices. To connect Kafka with Java we need two things: first a producer and then a consumer. The producer produces messages, or high-volume data, to send to another microservice or to a database. But before that, we need Kafka installed on our system.

Downloading Kafka

Before connecting Kafka with Java, we need to download Kafka. There are two ways to set it up:

  • Download the Kafka server (and ZooKeeper, for setups that still use it).
  • Use the official Docker image.

For downloading, you can refer to the installation guide for Windows or for Ubuntu.

Installing Kafka with Docker

Now, to work with Docker, first check whether Docker is installed. You can check the Docker version with the command below.

docker --version

If Docker is not installed, you can download Docker first. Once Docker is installed, let's start the procedure to install Kafka.

Step 1: Pull the Image

First, we will pull the image from Docker Hub using the command below.

docker pull apache/kafka:3.7.0

Check for the latest version and pull it accordingly. Once the Kafka image has been pulled, run the same command again: Docker will report that the image is up to date, confirming the pull succeeded.
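Alternatively, you can list the locally stored images for the repository to confirm the download:

docker images apache/kafka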


Running Kafka with Docker

Now that everything is installed, we are ready to run Kafka. Again, we have two ways to start the Kafka server.

1. Run Kafka with Docker Desktop

If you have Docker Desktop, you can go to the Images section and run the Kafka image from there.


2. Run Kafka from the Command Line

Alternatively, we can run Kafka using the command below.

docker run -p 9092:9092 apache/kafka:3.7.0

Kafka is now running, listening on port 9092.
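The producer and consumer examples later in this article use a topic named Messages. Depending on the broker configuration, topics may be auto-created on first use; if yours is not, you can create the topic manually with the kafka-topics.sh script bundled in the container (the script path below assumes the apache/kafka image layout; find the container name with docker ps):

docker exec -it <container-name> /opt/kafka/bin/kafka-topics.sh --create --topic Messages --bootstrap-server localhost:9092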


Error Handling and Logging

While connecting to the server or receiving messages, we can run into exceptions, and real-world code cannot go without exception handling. To make failures easier to understand, we will also add logging to our application; the logging dependency is managed in the pom below.

Kafka can raise many exceptions related to connectivity, serialization, and deserialization, as well as errors while sending messages or network failures. Let's look at the main exceptions and the classes the Java client provides for handling them:

  1. SerializationException: errors while serializing or deserializing messages. This generally occurs with a mismatched serializer configuration.
  2. TimeoutException/NetworkException: raised on the producer side when a message cannot be delivered to the Kafka server, for example due to a network issue.
  3. CommitFailedException: errors during offset commits in consumers.

For logging we will use the SLF4J logger, and logging a message takes just two lines:

private static final Logger logger = LoggerFactory.getLogger(KafkaExample.class);
logger.error(message, exception);

Using these two lines we can log our exceptions.
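As a minimal sketch of how the exception handling and logging above fit together, here is a hypothetical helper (the class and method names are illustrative, not part of any Kafka API) that sends one record, blocks for the result, and logs the exception types listed above:

Java
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.errors.TimeoutException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.util.concurrent.ExecutionException;

public class KafkaExample {
    private static final Logger logger = LoggerFactory.getLogger(KafkaExample.class);

    // Illustrative helper: sends one record and logs failures by exception type.
    static void sendAndLog(Producer<String, String> producer, ProducerRecord<String, String> record) {
        try {
            // get() blocks until the broker acknowledges, so send failures surface here
            producer.send(record).get();
        } catch (SerializationException e) {
            // Thrown synchronously by send() on a mismatched serializer configuration
            logger.error("Serialization failed for record", e);
        } catch (ExecutionException e) {
            if (e.getCause() instanceof TimeoutException) {
                // Broker unreachable or the request timed out
                logger.error("Timed out sending record", e.getCause());
            } else {
                logger.error("Send failed", e.getCause());
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            logger.error("Interrupted while sending record", e);
        }
    }
}

Now let's create an application and connect Java with Kafka.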

Create Producer and Consumer Classes in Java with Dependencies

To create the Java application we will follow these steps.

Step 1: Create a Java Application

We will create a plain Java application with Maven.


Step 2: Add Dependencies

In pom.xml we will add the dependencies for the Kafka client and the SLF4J logger.

XML
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>Kafka</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>17</maven.compiler.source>
        <maven.compiler.target>17</maven.compiler.target>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>3.7.0</version>
        </dependency>

        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>2.0.13</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>2.0.13</version>
        </dependency>
    </dependencies>

</project>

Step 3: Create a Class for the Producer

Now we create a class that sets some producer properties: the bootstrap servers and the key and value serializer types. Using these, we will send a simple message to the server. With a producer we can send messages or data to another service, or post data into a database; Kafka gives us the ability to send data and have it picked up whenever the other entity is available.

Java
import org.apache.kafka.clients.producer.*;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class Main {
    public static void main(String[] args) {
        // Set up the producer properties
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        Producer<String, String> producer = null;

        try {
            // Create the producer
            producer = new KafkaProducer<>(props);

            // Create a producer record
            ProducerRecord<String, String> record = new ProducerRecord<>("Messages", "key", "Hello, Kafka!");

            // Send the record with a callback to handle exceptions
            producer.send(record, new Callback() {
                @Override
                public void onCompletion(RecordMetadata metadata, Exception exception) {
                    if (exception != null) {
                        System.err.println("Error sending record: " + exception.getMessage());
                        exception.printStackTrace();
                    } else {
                        System.out.println("Record sent successfully to topic " + metadata.topic() +
                                " partition " + metadata.partition() + " at offset " + metadata.offset());
                    }
                }
            });

        } catch (Exception e) {
            System.err.println("Error creating or sending producer: " + e.getMessage());
            e.printStackTrace();
        } finally {
            // Close the producer
            if (producer != null) {
                try {
                    producer.close();
                } catch (Exception e) {
                    System.err.println("Error closing producer: " + e.getMessage());
                    e.printStackTrace();
                }
            }
        }
    }
}
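Note that send() above is asynchronous: it returns immediately and the callback runs later on the producer's I/O thread (close() flushes any records still in flight). If you would rather block until the broker acknowledges the write, a minimal variant is to replace the callback-based call above with a wait on the returned Future:

Java
// Blocking variant of the send call above: get() waits for the broker's
// acknowledgement and rethrows any send failure as an ExecutionException.
RecordMetadata metadata = producer.send(record).get();
System.out.println("Record sent to partition " + metadata.partition() +
        " at offset " + metadata.offset());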

Step 4: Create a Class for the Consumer

Now we create a class that sets some consumer properties: the bootstrap servers, the key and value deserializer types, and, since this is a consumer, a group ID. The consumer can be a microservice, or an application that only manages a database. The code below polls in a loop that runs forever; in a real application you would instead register a listener that is called whenever a message arrives (a sketch of that approach follows the output section below).

Java
import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class Main {
    public static void main(String[] args) {
        // Set up the consumer properties
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group-id");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        Consumer<String, String> consumer = null;

        try {
            // Create the consumer
            consumer = new KafkaConsumer<>(props);

            // Subscribe to the topic
            consumer.subscribe(Collections.singletonList("Messages"));

            // Continuously poll for new messages
            while (true) {
                try {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("offset = %d, key = %s, value = %s%n", record.offset(), record.key(), record.value());
                    }
                } catch (Exception e) {
                    System.err.println("Error during poll: " + e.getMessage());
                    e.printStackTrace();
                }
            }
        } catch (Exception e) {
            System.err.println("Error creating or using consumer: " + e.getMessage());
            e.printStackTrace();
        } finally {
            // Close the consumer
            if (consumer != null) {
                try {
                    consumer.close();
                } catch (Exception e) {
                    System.err.println("Error closing consumer: " + e.getMessage());
                    e.printStackTrace();
                }
            }
        }
    }
}

Output

When both programs run against the local broker, the producer prints a confirmation with the topic, partition, and offset of the sent record, and the consumer prints the offset, key, and value of each record it receives.
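For the listener-style setup mentioned in Step 4, one common approach is Spring for Apache Kafka. The sketch below is a minimal illustration, assuming a Spring Boot application with the spring-kafka dependency (not included in the pom above) and broker settings supplied via spring.kafka.* properties; Spring then calls the annotated method for each record, so you don't write the poll loop yourself.

Java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Minimal listener sketch; assumes spring-kafka on the classpath and
// spring.kafka.bootstrap-servers (and deserializers) set in application.properties.
@Component
public class MessageListener {

    // Spring invokes this method for every record on the "Messages" topic.
    @KafkaListener(topics = "Messages", groupId = "my-group-id")
    public void onMessage(String value) {
        System.out.println("Received: " + value);
    }
}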

Conclusion

So, with this we can connect Java with Kafka. Using Kafka we can send different messages on different topics between many microservices. This is just generic connection code; for more, see the dedicated articles on how to create a consumer and how to create a producer.

