How to import CSV data to PostgreSQL Database using Spring Batch Job

Spring Batch is a powerful module for implementing batch processing of large volumes of data conveniently.
This tutorial guides you through importing CSV data into a PostgreSQL database using a Spring Batch job.

Related Articles:
How to start with Spring Batch using Spring Boot
How to use Spring JDBC Template for Postgres Database

I. Technology

– Java 1.8
– Maven 3.3.9
– Spring Tool Suite – Version 3.8.1.RELEASE
– Spring Boot: 1.4.0.RELEASE

II. Overview
1. Goal

To build a simple application that demonstrates how we can use Spring Batch with a job step (including ItemReader, ItemProcessor, ItemWriter and JobExecutionListener) to read customer data from a CSV file, then insert it into a PostgreSQL table named ‘customer’.
– Reader: reads the content of the CSV file, then maps each row to the fields of the data model Customer.
– Processor: converts each Customer record’s content to new content (for example, assigning a random ID and uppercasing the name) which will be written to the database table.
– Writer: writes a batch of records to the PostgreSQL database using the DAO.
– Listener: runs after the job completes; reads data from the database table to verify the results and writes logs.

– RestController: runs the job using JobLauncher, then returns a completion message to the client.
– DAO: interacts with the database.

2. Project Structure


3. Step by step

– Create Spring Boot project
– Configure application properties
– Create a DataModel
– Create a DAO
– Create Job Step: Reader, Processor, Writer, Listener
– Create Batch Configuration
– Create a WebController
– Run Spring Boot Application & Enjoy Result

III. Practice
1. Create Spring Boot project

– Open Spring Tool Suite, on the menu, choose File -> New -> Spring Starter Project, then fill in each field:

Click Next, in:
I/O: choose Batch
SQL: choose JDBC and PostgreSQL
Web: choose Web

Click Finish. The Spring Boot project will be created.

2. Configure application properties, add SQL script file & CSV file

– Add configuration for the datasource and disable Spring Batch job auto-run in application.properties (located in src/main/resources):
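A minimal sketch of the properties, assuming a local PostgreSQL instance with the database testcsvdb (seen later in the tutorial); the username and password are placeholders — use your own:

```properties
spring.datasource.url=jdbc:postgresql://localhost:5432/testcsvdb
spring.datasource.username=postgres
spring.datasource.password=123

# prevent the batch job from running automatically at startup;
# it will be triggered from the RestController instead
spring.batch.job.enabled=false
```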

– under src/main/resources:
+ schema-postgresql.sql
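A minimal sketch of schema-postgresql.sql; the column names (id, first_name, last_name) are assumptions matching the Customer model used in this tutorial:

```sql
DROP TABLE IF EXISTS customer;

CREATE TABLE customer (
  id BIGINT PRIMARY KEY,
  first_name VARCHAR(50),
  last_name VARCHAR(50)
);
```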

+ customer-data.csv
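A few sample rows for customer-data.csv (the names are made up); each line holds a first name and a last name, matching the column mapping assumed by the reader:

```
jack,smith
adam,johnson
kim,smith
david,williams
peter,davis
```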

3. Create a DataModel

Under package model, create class Customer.
Content of Customer.java:
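A minimal sketch of the model, assuming the record carries an id plus first and last name (adjust the fields to your actual data):

```java
// Hypothetical Customer model -- the field names (id, firstName, lastName)
// are assumptions matching the CSV columns and table schema used here.
public class Customer {
    private long id;
    private String firstName;
    private String lastName;

    public Customer() {}

    public Customer(long id, String firstName, String lastName) {
        this.id = id;
        this.firstName = firstName;
        this.lastName = lastName;
    }

    public long getId() { return id; }
    public void setId(long id) { this.id = id; }

    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }

    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }

    @Override
    public String toString() {
        return "Customer[id=" + id + ", firstName=" + firstName
                + ", lastName=" + lastName + "]";
    }
}
```

The default constructor and setters matter: Spring Batch's BeanWrapperFieldSetMapper populates the bean through them when mapping a CSV row.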

4. Create a DAO

– Under package dao, create interface CustomerDao:

– Under package dao.impl, create CustomerDaoImpl, an implementation of CustomerDao:
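The tutorial's DAO uses Spring's JdbcTemplate; as a self-contained illustration of the same batch insert, here is a plain-JDBC sketch (the table and column names are assumptions):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

// Plain-JDBC sketch of the DAO's batch insert. The tutorial wires Spring's
// JdbcTemplate instead; table and column names here are assumptions.
public class CustomerJdbcDao {

    static final String INSERT_SQL =
            "INSERT INTO customer (id, first_name, last_name) VALUES (?, ?, ?)";

    // Minimal stand-in for the Customer model so the sketch is self-contained.
    static class Customer {
        final long id;
        final String firstName;
        final String lastName;

        Customer(long id, String firstName, String lastName) {
            this.id = id;
            this.firstName = firstName;
            this.lastName = lastName;
        }
    }

    // Inserts all customers in one JDBC batch, mirroring what the
    // ItemWriter does for each chunk.
    public void saveAll(Connection conn, List<Customer> customers) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(INSERT_SQL)) {
            for (Customer c : customers) {
                ps.setLong(1, c.id);
                ps.setString(2, c.firstName);
                ps.setString(3, c.lastName);
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }
}
```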

5. Create Job Step: Reader, Processor, Writer, Listener

Under package step:

JobExecutionListenerSupport is an implementation of JobExecutionListener.
We can run our own operations before the job starts (by overriding the beforeJob method) and after it ends, whether it completed normally or failed (by overriding the afterJob method).
The annotations corresponding to this interface are @BeforeJob and @AfterJob.
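The processor's conversion (assign a random ID, uppercase the name) can be sketched in plain Java, outside the Spring Batch ItemProcessor interface, so the snippet runs standalone; the exact transformation is an assumption based on the overview above:

```java
import java.util.Random;

// Standalone sketch of the processor logic. In the real step class this
// body would live in ItemProcessor<Customer, Customer>.process(item).
public class CustomerProcessorSketch {

    private static final Random RANDOM = new Random();

    // Returns {id, firstName, lastName} as they would be handed to the writer:
    // a random id, and names converted to uppercase.
    static String[] process(String firstName, String lastName) {
        long randomId = Math.abs(RANDOM.nextLong() % 1_000_000L);
        return new String[] {
                String.valueOf(randomId),
                firstName.toUpperCase(),
                lastName.toUpperCase()
        };
    }

    public static void main(String[] args) {
        String[] out = process("jack", "smith");
        System.out.println(out[1] + " " + out[2]); // prints "JACK SMITH"
    }
}
```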

6. Create Batch Configuration

Under package config, create the batch configuration class:

The parameter passed to the chunk method of the step builder specifies the number of items to read before the chunk is written out via the ItemWriter.
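Putting the pieces together, a hedged sketch of the configuration against the Spring Batch API shipped with Boot 1.4; the bean names, the chunk size of 10, and the injected processor, writer and JobCompletionListener types are illustrative stand-ins for the step classes created above:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;

@Configuration
@EnableBatchProcessing
public class BatchConfig {

    // Reader: maps each CSV line (firstName,lastName) to a Customer.
    @Bean
    public FlatFileItemReader<Customer> reader() {
        FlatFileItemReader<Customer> reader = new FlatFileItemReader<>();
        reader.setResource(new ClassPathResource("customer-data.csv"));

        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        tokenizer.setNames(new String[] { "firstName", "lastName" });

        BeanWrapperFieldSetMapper<Customer> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
        fieldSetMapper.setTargetType(Customer.class);

        DefaultLineMapper<Customer> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(fieldSetMapper);
        reader.setLineMapper(lineMapper);
        return reader;
    }

    // Step: read -> process -> write, in chunks of 10 items.
    @Bean
    public Step step1(StepBuilderFactory stepBuilderFactory,
                      ItemProcessor<Customer, Customer> processor,
                      ItemWriter<Customer> writer) {
        return stepBuilderFactory.get("step1")
                .<Customer, Customer>chunk(10)
                .reader(reader())
                .processor(processor)
                .writer(writer)
                .build();
    }

    // Job: a single step, with the listener attached for post-run verification.
    @Bean
    public Job importCustomerJob(JobBuilderFactory jobBuilderFactory,
                                 JobCompletionListener listener, Step step1) {
        return jobBuilderFactory.get("importCustomerJob")
                .incrementer(new RunIdIncrementer())
                .listener(listener)
                .flow(step1)
                .end()
                .build();
    }
}
```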

7. Create a WebController

Under package controller, create the WebController class:
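A hedged sketch of the controller; the request path /runjob and the response message are assumptions, and a timestamp is added to the JobParameters so the job can be launched more than once:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class WebController {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job;

    // Launches the batch job on request; unique JobParameters are required
    // for each run, hence the current-time parameter.
    @RequestMapping("/runjob")
    public String runJob() throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addLong("time", System.currentTimeMillis())
                .toJobParameters();
        jobLauncher.run(job, params);
        return "Batch job has been invoked";
    }
}
```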

8. Run Spring Boot Application & Enjoy Result

– Configure the Maven build with goals:
clean install
– Run the project as a Spring Boot App
– Check the results by accessing:

System shows:

Open pgAdmin III to check table customer in database testcsvdb:

IV. Source Code

