Spring Batch is a lightweight yet robust framework designed for batch processing: the automated execution of large data tasks without human intervention. It provides reusable components for logging, transaction management, job restart, retries and error handling.
When integrated with Spring Boot, it simplifies batch job configuration and execution, allowing developers to focus on the business logic instead of boilerplate setup.
What is Batch Processing?
Batch processing refers to executing repetitive, data-intensive tasks in bulk. Typical examples include:
- Processing large datasets
- Database migration
- Generating reports
- ETL (Extract, Transform, Load) operations
Spring Batch is purpose-built for such use cases by splitting jobs into smaller, manageable steps that can run sequentially or in parallel.
Jobs, Steps and Flow
A Job in Spring Batch represents the complete batch process, while Steps define the logical phases within that job.
- Job: Encapsulates the full batch process, consisting of multiple steps.
- Step: Represents one stage of a job — typically involves reading, processing and writing data.
- Flow: Defines the execution order of steps. You can create conditional or parallel flows (e.g., Step 2 runs only if Step 1 succeeds).
Each step operates in three distinct phases: ItemReader, ItemProcessor and ItemWriter.
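A conditional flow like the one described above can be sketched with the Spring Batch 5 builder API. This is an illustrative fragment, not part of the example project; step1, step2 and recoveryStep are assumed to be existing Step beans:

```java
@Bean
public Job conditionalJob(JobRepository jobRepository,
                          Step step1, Step step2, Step recoveryStep) {
    return new JobBuilder("conditionalJob", jobRepository)
            .start(step1)
            .on("FAILED").to(recoveryStep)   // run recoveryStep only if step1 fails
            .from(step1).on("*").to(step2)   // otherwise continue with step2
            .end()
            .build();
}
```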
Core Components of Spring Batch
1. ItemReader
Reads input data from a source such as a database, file, or message queue. It reads one record at a time and passes it to the processor.
public class StringReader implements ItemReader<String> {

    private final String[] data = {"Spring", "Batch", "Example"};
    private int index = 0;

    @Override
    public String read() {
        // Returning null signals that the input is exhausted
        return index < data.length ? data[index++] : null;
    }
}
2. ItemProcessor
Applies business logic or transformation on each item read by the reader.
public class StringProcessor implements ItemProcessor<String, String> {

    @Override
    public String process(String item) {
        return item.toUpperCase(); // Transform text to uppercase
    }
}
3. ItemWriter
Writes the processed data to the desired output, such as a database or console.
public class ConsoleWriter implements ItemWriter<String> {

    @Override
    public void write(Chunk<? extends String> items) {
        // Spring Batch 5 passes the processed items as a Chunk rather than a List
        for (String item : items) {
            System.out.println(item);
        }
    }
}
Chunk-Oriented Processing
Spring Batch processes data in chunks, not all at once.
Each step reads and processes individual items, but commits them in groups defined by a chunk size, improving both performance and transaction management.
new StepBuilder("step", jobRepository)
        .<String, String>chunk(10, transactionManager)
        .reader(reader())
        .processor(processor())
        .writer(writer())
        .build();
In this example:
- Up to 10 items are read and processed, one at a time.
- Once the chunk is full (or the input is exhausted), the accumulated items are written and committed in a single transaction.
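The chunk cycle itself can be illustrated without any framework code. The sketch below is plain Java, not the Spring Batch API: it mimics what the framework does internally, reading items one at a time, transforming each, and handing them to the writer in groups of chunkSize:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkLoopSketch {

    // Simulates one chunk-oriented step: read -> process -> write in groups
    static List<List<String>> runStep(String[] source, int chunkSize) {
        List<List<String>> written = new ArrayList<>();
        List<String> chunk = new ArrayList<>();
        int index = 0;
        while (true) {
            // "read": one item at a time; null signals end of input
            String item = index < source.length ? source[index++] : null;
            if (item == null) {
                break;
            }
            // "process": transform the item
            chunk.add(item.toUpperCase());
            // "write": flush the chunk once it is full
            if (chunk.size() == chunkSize) {
                written.add(new ArrayList<>(chunk));
                chunk.clear();
            }
        }
        // Flush any partial final chunk
        if (!chunk.isEmpty()) {
            written.add(chunk);
        }
        return written;
    }

    public static void main(String[] args) {
        runStep(new String[]{"spring", "batch", "example", "chunk", "demo"}, 2)
                .forEach(c -> System.out.println("Writing chunk: " + c));
    }
}
```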
Job Repository and Metadata
The Job Repository maintains execution metadata for jobs and steps, including:
- JobInstance: Represents a unique execution configuration.
- JobExecution: Tracks job runs, including status and timestamps.
- StepExecution: Records details of each step execution.
This allows restartability (resume from failure point) and monitoring of batch executions. A relational database (e.g., MySQL, HSQLDB) typically stores this metadata.
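This stored metadata can also be inspected programmatically through the JobExplorer. The fragment below is a sketch; it assumes a Spring context with a configured JobExplorer bean and a job named importUserJob:

```java
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobInstance;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.stereotype.Component;

import java.util.List;

@Component
public class JobAuditor {

    private final JobExplorer jobExplorer;

    public JobAuditor(JobExplorer jobExplorer) {
        this.jobExplorer = jobExplorer;
    }

    public void printLastRun() {
        // Fetch the most recent instance of the job, if any
        List<JobInstance> instances = jobExplorer.getJobInstances("importUserJob", 0, 1);
        for (JobInstance instance : instances) {
            for (JobExecution execution : jobExplorer.getJobExecutions(instance)) {
                System.out.println(execution.getStatus() + " at " + execution.getStartTime());
            }
        }
    }
}
```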
Transaction Management and Error Handling
Spring Batch ensures transactional integrity — if a step fails, its changes can be rolled back.
Error Handling Strategies:
- Retry: Automatically retry failed operations a limited number of times.
- Skip: Ignore certain failed records.
- Listeners: Run custom logic before or after steps.
new StepBuilder("step", jobRepository)
        .<String, String>chunk(10, transactionManager)
        .reader(reader())
        .processor(processor())
        .writer(writer())
        .faultTolerant()
        .retry(Exception.class)
        .retryLimit(3)
        .build();
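A skip policy can be configured the same way. This is an illustrative fragment; FlatFileParseException is just one example of an exception type worth tolerating when reading flat files:

```java
new StepBuilder("step", jobRepository)
        .<String, String>chunk(10, transactionManager)
        .reader(reader())
        .processor(processor())
        .writer(writer())
        .faultTolerant()
        .skip(FlatFileParseException.class)  // tolerate malformed records
        .skipLimit(5)                        // fail the step after 5 skips
        .build();
```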
Scheduling Batch Jobs
You can schedule jobs using Spring's @Scheduled annotation or tools like Quartz:
@Component
@EnableScheduling
public class BatchScheduler {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job;

    @Scheduled(cron = "0 0 12 * * ?") // Runs every day at noon
    public void runJob() throws Exception {
        // A unique timestamp parameter lets the same job run again as a new JobInstance
        JobParameters parameters = new JobParametersBuilder()
                .addLong("time", System.currentTimeMillis())
                .toJobParameters();
        jobLauncher.run(job, parameters);
    }
}
Example Project: Spring Boot with Spring Batch
This project reads data from a CSV file, processes it and writes it to a MySQL database.
Step 1: Create a New Spring Boot Project
Project Name: spring-batch-example
Type: Maven
Packaging: Jar
Dependencies:
- Spring Batch
- Spring Data JPA
- Spring Boot DevTools
- Lombok
- MySQL Driver
Project Structure:
After the project is created, the folder structure will look like this:

Step 2: Configure application.properties
spring.application.name=spring-batch-example
# Database Configuration
spring.datasource.url=jdbc:mysql://localhost:3306/spring_batch_db?useSSL=false&serverTimezone=UTC
spring.datasource.username=root
spring.datasource.password=mypassword
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
# Hibernate Settings
spring.jpa.hibernate.ddl-auto=update
spring.jpa.show-sql=true
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQLDialect
# Batch Settings
spring.batch.job.enabled=true
spring.batch.jdbc.initialize-schema=always
server.port=8080
Step 3: Create CSV Data File
Create data.csv in src/main/resources/:
firstName,lastName
Mahesh,Kadambala
Ravi,Teja
Lakshmi,Narayana
Praveen,Chowdary
Kiran,Kumar
Saneep,Kumar
Akhil,Hero
Gautam,P
Madhavo,Reddy
Suresh,Kumar
Ravi,Teja
Lakshmi,Narayana
Anusha,Reddy
Venkat,Rao
Praveen,Chowdary
Sowmya,Krishna
Kiran,Kumar
Manjula,Rao
Naveen,Prasad
Madhavi,Reddy
Srinivas,Rao
Ramya,Lakshmi
Venkatesh,Babu
Sujatha,Rani
Step 4: Create SQL Schema
Create schema.sql in src/main/resources/:
CREATE TABLE people (
id BIGINT AUTO_INCREMENT PRIMARY KEY,
first_name VARCHAR(255),
last_name VARCHAR(255)
);
Step 5: Spring Batch Configuration
Create BatchConfig.java to configure Spring Batch with a reader, processor and writer.
BatchConfig.java
package com.gfg.springbatchexample;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.job.builder.JobBuilder;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.database.builder.JdbcBatchItemWriterBuilder;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.transaction.PlatformTransactionManager;

import javax.sql.DataSource;

@Configuration
public class BatchConfig {

    @Bean
    public FlatFileItemReader<Person> reader() {
        return new FlatFileItemReaderBuilder<Person>()
                .name("personItemReader")
                .resource(new ClassPathResource("data.csv"))
                .linesToSkip(1) // Skip the CSV header row
                .delimited()
                .names("firstName", "lastName")
                .fieldSetMapper(fieldSet -> {
                    Person person = new Person();
                    person.setFirstName(fieldSet.readString("firstName"));
                    person.setLastName(fieldSet.readString("lastName"));
                    return person;
                })
                .build();
    }

    @Bean
    public JdbcBatchItemWriter<Person> writer(DataSource dataSource) {
        return new JdbcBatchItemWriterBuilder<Person>()
                .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
                .sql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)")
                .dataSource(dataSource)
                .build();
    }

    @Bean
    public Step step(JobRepository jobRepository, PlatformTransactionManager transactionManager,
                     FlatFileItemReader<Person> reader, JdbcBatchItemWriter<Person> writer) {
        return new StepBuilder("step", jobRepository)
                .<Person, Person>chunk(10, transactionManager)
                .reader(reader)
                .processor(new PersonItemProcessor())
                .writer(writer)
                .build();
    }

    @Bean
    public Job job(JobRepository jobRepository, Step step,
                   JobCompletionNotificationListener listener) {
        return new JobBuilder("importUserJob", jobRepository)
                .start(step)
                .listener(listener) // Log and verify results once the job finishes
                .build();
    }
}
Step 6: Model Class
Person.java
package com.gfg.springbatchexample;

import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import lombok.Getter;
import lombok.Setter;

@Entity
@Getter
@Setter
public class Person {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String firstName;
    private String lastName;

    public Person() {}

    public Person(String firstName, String lastName) {
        this.firstName = firstName;
        this.lastName = lastName;
    }

    @Override
    public String toString() {
        return "Person{firstName='" + firstName + "', lastName='" + lastName + "'}";
    }
}
Step 7: Repository Interface
Create PersonRepository.java to access the database.
package com.gfg.springbatchexample;

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;

@Repository
public interface PersonRepository extends JpaRepository<Person, Long> {
}
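This repository can then be injected anywhere in the application to query the imported rows. As an illustrative sketch (the ImportVerifier class is not part of the example project, and depending on startup ordering it may run before the batch job finishes):

```java
import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;

@Component
public class ImportVerifier implements CommandLineRunner {

    private final PersonRepository personRepository;

    public ImportVerifier(PersonRepository personRepository) {
        this.personRepository = personRepository;
    }

    @Override
    public void run(String... args) {
        // Count the rows currently present in the people table
        System.out.println("Rows in people table: " + personRepository.count());
    }
}
```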
Step 8: Add Job Completion Listener
Create JobCompletionNotificationListener.java to log results after batch completion.
JobCompletionNotificationListener.java
package com.gfg.springbatchexample;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Component;

@Component
public class JobCompletionNotificationListener implements JobExecutionListener {

    private static final Logger log = LoggerFactory.getLogger(JobCompletionNotificationListener.class);

    private final JdbcTemplate jdbcTemplate;

    public JobCompletionNotificationListener(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        if (jobExecution.getStatus() == BatchStatus.COMPLETED) {
            log.info("Job completed. Verifying results...");
            jdbcTemplate.query("SELECT first_name, last_name FROM people",
                            (rs, row) -> new Person(rs.getString(1), rs.getString(2)))
                    .forEach(person -> log.info("Found <{}> in the database.", person));
        }
    }
}
Step 9: Processor Class
PersonItemProcessor.java
package com.gfg.springbatchexample;

import org.springframework.batch.item.ItemProcessor;

public class PersonItemProcessor implements ItemProcessor<Person, Person> {

    @Override
    public Person process(Person person) throws Exception {
        // Normalize both name fields to uppercase before writing
        String firstName = person.getFirstName().toUpperCase();
        String lastName = person.getLastName().toUpperCase();
        return new Person(firstName, lastName);
    }
}
Step 10: Main Application
With Spring Boot 3, batch support is auto-configured, so the main class only needs @SpringBootApplication. (Adding @EnableBatchProcessing here would switch off Boot's batch auto-configuration, including launching the job at startup.)
SpringBatchExampleApplication.java
package com.gfg.springbatchexample;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class SpringBatchExampleApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringBatchExampleApplication.class, args);
    }
}
Step 11: Run the Application
Run the application and it will start on port 8080.
Spring Batch will read data from the CSV file, convert names to uppercase and insert them into the people table.
Console Logs Example:


Person Table Data in Database
