Feeding JSON Data to a Kafka Topic Using REST Proxy
Last Updated :
10 May, 2025
This POC describes how to feed JSON-format data to a Kafka topic using the Kafka REST Proxy, which provides a RESTful interface to a Kafka cluster.
Prerequisites:
Before you start this procedure, ensure that you have:
- Administrative access to a running Kafka VM, with the connectivity described in the installation prerequisites.
- Identified and noted the ZooKeeper hostname and port.
- Identified and noted the hostname and port of the Kafka broker(s).
- Identified and noted the hostname and port of the Kafka REST Proxy.
Note: This procedure assumes that you have installed the Apache Kafka distribution. If you are using a different Kafka distribution, you may need to adjust certain commands in the procedure.
In this use case, the hostnames and ports are configured as follows:
- REST Proxy: localhost:8082
- ZooKeeper: localhost:2182
- Bootstrap server (Kafka broker): localhost:9095
Procedure to feed JSON data to Kafka Topic:
Step 1: Log in to a host in your Kafka VM.
$ cd kafka_2.12-2.4.0 /*if this directory does not exist, use the ls command to find your Kafka installation directory*/
To list all the topics present in the Kafka cluster, use the command below:
$ bin/kafka-topics.sh --list --zookeeper localhost:2182 /*displays all existing topics*/

Step 2: Create a Kafka topic. Here, create a topic named "topic-test-1" with a single partition and only one replica:
For example:
$ bin/kafka-topics.sh --create --zookeeper localhost:2182 --replication-factor 1 --partitions 1 --topic topic-test-1

$ bin/kafka-topics.sh --list --zookeeper localhost:2182 /*to verify that the topic was created*/

Step 3: Create a JSON file. Create a file named sample-json-data.json in the editor of your choice.
For example:
$ vi sample-json-data.json
Then paste some JSON-format text into the file, save, and exit.
For example:
{
"first_name": "Tom",
"last_name": "Cruze",
"email": "[email protected]",
"gender": "Male",
"ip_address": "1.2.3.4"
}
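The record above can also be produced programmatically. A minimal Python sketch (the field values simply mirror the example and are illustrative only):

```python
import json

# Illustrative record mirroring the example above
record = {
    "first_name": "Tom",
    "last_name": "Cruze",
    "email": "[email protected]",
    "gender": "Male",
    "ip_address": "1.2.3.4",
}

# Write it to sample-json-data.json as a single JSON document,
# which the console producer will read as one message
with open("sample-json-data.json", "w") as f:
    json.dump(record, f)
```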

Step 4: Stream the contents of the JSON file to a Kafka console producer:
$ bin/kafka-console-producer.sh --broker-list localhost:9095 --topic topic-test-1 < sample-json-data.json

Step 5: Verify that the Kafka console producer published the messages to the topic by running a Kafka console consumer:
$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9095 --topic topic-test-1 --from-beginning

Step 6: Stream the contents of another JSON file to a Kafka console producer.
For example:
$ vi sample.json
Then paste the JSON-format text below into the file, save, and exit:
{ "cust_id": 1313131, "month": 12, "expenses": 1313.13 }
{ "cust_id": 3535353, "month": 11, "expenses": 761.35 }
{ "cust_id": 7979797, "month": 10, "expenses": 4489.00 }
{ "cust_id": 7979797, "month": 11, "expenses": 18.72 }
{ "cust_id": 3535353, "month": 10, "expenses": 6001.94 }
{ "cust_id": 7979797, "month": 12, "expenses": 173.18 }
{ "cust_id": 1313131, "month": 10, "expenses": 492.83 }
{ "cust_id": 3535353, "month": 12, "expenses": 81.12 }
{ "cust_id": 1313131, "month": 11, "expenses": 368.27 }
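Note that sample.json holds one JSON object per line (JSON Lines); the console producer sends each line as a separate message. A small Python sketch that writes and re-reads a file of this shape (using a subset of the rows above):

```python
import json

# Each dict becomes one line in the file, i.e. one Kafka message
rows = [
    {"cust_id": 1313131, "month": 12, "expenses": 1313.13},
    {"cust_id": 3535353, "month": 11, "expenses": 761.35},
    {"cust_id": 7979797, "month": 10, "expenses": 4489.00},
]

with open("sample.json", "w") as f:
    for row in rows:
        f.write(json.dumps(row) + "\n")

# Re-read to confirm there is one valid JSON document per line
with open("sample.json") as f:
    parsed = [json.loads(line) for line in f]
```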

Kafka REST Proxy:
The Kafka REST Proxy provides a RESTful interface to a Kafka cluster. It makes it easy to produce and consume messages, view the state of the cluster, and perform administrative actions without using the native Kafka protocol or clients.
To get the list of topics using curl:
$ curl "http://localhost:8082/topics"
To get information about one topic:
$ curl http://localhost:8082/topics/<topic-name>
For example:
$ curl "http://localhost:8082/topics/topic-test-1"
Step 1: Produce a message with a JSON value to a topic.
For example, to produce a message with the value '{ "month": 12 }' to the topic topic-test-1:
$ curl -X POST -H "Content-Type: application/vnd.kafka.json.v2+json" \
-H "Accept: application/vnd.kafka.v2+json" \
--data '{"records":[{"value":{"month": 12}}]}' "http://localhost:8082/topics/topic-test-1"
/*Expected output from preceding command*/
{
"offsets":[{"partition":0,"offset":16,"error_code":null,"error":null}],"key_schema_id":null,"value_schema_id":null
}
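The produce request wraps messages in a "records" envelope. As a sketch, the same request can be built with Python's standard library (the URL and topic match the example; the request object is only constructed here, nothing is sent):

```python
import json
import urllib.request

# Envelope required by the REST Proxy produce endpoint: a "records"
# list, each entry carrying a "value" (and optionally a "key")
payload = {"records": [{"value": {"month": 12}}]}

req = urllib.request.Request(
    "http://localhost:8082/topics/topic-test-1",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/vnd.kafka.json.v2+json",
        "Accept": "application/vnd.kafka.v2+json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it once the proxy is running
```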
To verify that the message was published to the topic, run a Kafka console consumer:

$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9095 --topic topic-test-1 --from-beginning
Step 2: Create a consumer for JSON data, starting at the beginning of the topic's log.
$ curl -X POST -H "Content-Type: application/vnd.kafka.v2+json" \
--data '{"name": "my_consumer_instance", "format": "json", "auto.offset.reset": "earliest"}' \
http://localhost:8082/consumers/my_json_consumer
/* Expected output from preceding command*/
{
"instance_id":"my_consumer_instance",
"base_uri":"http://localhost:8082/consumers/my_json_consumer/instances/my_consumer_instance"
}
Depending on your configuration, base_uri may instead be "http://rest-proxy:8082/consumers/my_json_consumer/instances/my_consumer_instance".
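The consumer lifecycle is driven by small JSON payloads. A sketch of the creation payload above and the subscription payload used in the next step, plus parsing a creation response of this shape to build the per-instance records URL:

```python
import json

# Payload for POST /consumers/my_json_consumer: names the instance,
# sets the embedded data format to JSON, and starts from the earliest offset
create_consumer = {
    "name": "my_consumer_instance",
    "format": "json",
    "auto.offset.reset": "earliest",
}

# Payload for the subscription call (Step 3)
subscribe = {"topics": ["topic-test-1"]}

# Parse a creation response like the one above to get the instance URL
response = json.loads(
    '{"instance_id":"my_consumer_instance",'
    '"base_uri":"http://localhost:8082/consumers/'
    'my_json_consumer/instances/my_consumer_instance"}'
)
records_url = response["base_uri"] + "/records"
```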

Step 3: Subscribe the consumer to a topic.
$ curl -X POST -H "Content-Type: application/vnd.kafka.v2+json" --data '{"topics":["topic-test-1"]}' \
http://localhost:8082/consumers/my_json_consumer/instances/my_consumer_instance/subscription
/* Expected output from preceding command*/
# No content in response

Step 4: Consume data using the base_uri returned in the first response.
$ curl -X GET -H "Accept: application/vnd.kafka.json.v2+json" \
http://localhost:8082/consumers/my_json_consumer/instances/my_consumer_instance/records

Optional Steps:
Step 1: Finally, close the consumer with a DELETE request so that it leaves the group and its resources are cleaned up.
$ curl -X DELETE -H "Content-Type: application/vnd.kafka.v2+json" \
http://localhost:8082/consumers/my_json_consumer/instances/my_consumer_instance
/* Expected output from preceding command*/
# No content in response
Step 2: Verify that the consumer instance was deleted using the following command:
$ curl -X GET -H "Accept: application/vnd.kafka.json.v2+json" \
http://localhost:8082/consumers/my_json_consumer/instances/my_consumer_instance/records
/* Expected output from preceding command*/
{ "error_code": 40403, "message": "Consumer instance not found." }