Apache Kafka - Create Consumers in a Consumer Group using CLI
In Apache Kafka, a consumer group is a set of consumers that work together to consume a topic. Each consumer in the group is assigned a subset of the topic's partitions. As a consumer processes messages from its assigned partitions, it commits the offset (its position in each partition) back to Kafka. In this way, the group tracks its progress through the topic and can pick up where it left off if a consumer fails or the group needs to scale up or down. This makes it easy to build fault-tolerant, scalable, and distributed applications with Kafka. Here are some key points to remember about consumer groups in Kafka:
- One or more consumers can be part of a consumer group.
- A consumer group can consume multiple topics.
- Each partition in a topic can only be consumed by a single consumer in a group.
- Each consumer in a group is responsible for consuming a unique set of partitions.
- Offsets are committed to Kafka per partition under the consumer group's ID, not per individual consumer. This means that if a consumer fails, the remaining consumers in the group take over its partitions and continue from the last committed offsets. (The kafka-consumer-groups.sh command after this list shows how to inspect these offsets.)
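To see these per-partition committed offsets, Kafka ships with the kafka-consumer-groups.sh tool; the broker address and group name below are the same ones used in the example later in this article:
kafka-consumer-groups.sh --bootstrap-server localhost:9092 --describe --group my-first-application
The output lists, for each partition, the CURRENT-OFFSET (last committed offset), the LOG-END-OFFSET, the LAG, and the consumer currently assigned to that partition.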
Example of Creating a Consumer Group
Each partition can be assigned to only one consumer in a group, so any consumers beyond the number of partitions will sit idle. Therefore, first create a Kafka topic with a sufficient number of partitions. In this example, the topic has 3 partitions:
kafka-topics.sh --bootstrap-server localhost:9092 --topic my-topic --create --partitions 3 --replication-factor 1
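To double-check the partition count before starting any consumers, the same topic can be described (assuming the same local broker as above):
kafka-topics.sh --bootstrap-server localhost:9092 --topic my-topic --describe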
Next, start a consumer in a consumer group called 'my-first-application'.
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-topic --group my-first-application
In a new terminal window, start a second consumer in the 'my-first-application' consumer group using the same command as the first consumer.
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-topic --group my-first-application
Open a new terminal window and start a third consumer in the 'my-first-application' consumer group.
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-topic --group my-first-application
Note: The same command is being used to create multiple consumers in the same consumer group.
Each consumer in the 'my-first-application' consumer group will be assigned a partition. Send a few string messages to the topic.
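A simple way to do this is with the kafka-console-producer.sh tool that ships with Kafka (assuming the same local broker); every line typed at the prompt is sent as one message:
kafka-console-producer.sh --bootstrap-server localhost:9092 --topic my-topic
Since these messages have no keys, they are distributed across the partitions over time, so different consumers in the group will receive them.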
Each consumer will only display the messages produced on the partitions that have been assigned to it.
If you stop one of the consumers, its partitions are automatically reassigned to the remaining consumers, because consumer groups perform a rebalance whenever a consumer leaves or joins.
If you stop all the consumers and later restart one, it will resume from the group's last committed offsets and therefore only process messages produced after the consumers were stopped.
Here are a few things to remember:
- If you use the --group option to consume data in a consumer group, the --from-beginning option is ignored whenever the group already has committed offsets. To reset the consumer group to the beginning of the topic, use the kafka-consumer-groups.sh --reset-offsets command (see the example after this list).
- If you do not specify a --group option, the consumer will be part of a random consumer group, such as console-consumer-11984.
- If you notice that only one consumer is receiving all the messages, it is likely that the topic was created with only one partition. You can verify this using the kafka-topics.sh --describe command shown earlier.
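As a sketch of the offset reset mentioned in the first point (assuming the same broker, topic, and group as above, and that every consumer in the group has been stopped first):
kafka-consumer-groups.sh --bootstrap-server localhost:9092 --group my-first-application --topic my-topic --reset-offsets --to-earliest --execute
Replacing --execute with --dry-run previews the new offsets without applying them.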
Here are some advanced parameters that you can use when creating a Kafka consumer group:
- --bootstrap-server: the host and port of at least one Kafka broker in the cluster.
- --topic: the name of the topic that the consumer group will consume from.
- --group: the name of the consumer group.
- --from-beginning: start consuming from the beginning of the topic rather than the latest offset; when combined with --group, this only takes effect if the group has no committed offsets.
- --max-messages: the maximum number of messages to consume before exiting.
- --consumer-property: pass a consumer configuration in the form name=value, for example auto.offset.reset=earliest, which controls where to start when the group has no committed offsets (valid values are earliest, latest, and none).
- --partition: the partition to consume from. If not specified, the consumer will consume from all partitions in the topic.
- --property: set a message formatter option in the form name=value, such as print.key=true or key.separator=,.
Here is an example of using some of these options to create a consumer group:
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-topic --group my-first-application --from-beginning --property print.key=true --property key.separator=,
This creates a consumer in the my-first-application group that consumes from the my-topic topic, starts from the earliest offset if the group has no committed offsets, and prints each message's key and value separated by a comma.
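As one more sketch, --max-messages can be combined with --from-beginning; the group name my-second-application is hypothetical, chosen here so that the group has no committed offsets and --from-beginning takes effect:
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-topic --group my-second-application --from-beginning --max-messages 5
This consumer prints the first 5 messages it receives and then exits.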