I'm new to Spring. I'm working on a batch job that loads data from a CSV file into a Sybase table. The input file is more than 5 GB and has a few million records.
I'm doing a batch update using Spring's JdbcTemplate. Below is my code snippet.
    public int[] batchUpdate(final CsvReader products) throws IOException,
            SQLException {
        int[] updateCounts = getJdbcTemplate().batchUpdate(sqlStatement,
                new AbstractInterruptibleBatchPreparedStatementSetter() {
                    @Override
                    public boolean setValuesIfAvailable(PreparedStatement pstmt, int i)
                            throws SQLException {
                        // logic to set the values on the prepared statement etc...
                        // (returns true while CSV rows remain, false to end the batch)
                    }

                    @Override
                    public int getBatchSize() {
                        return batchSize;
                    }
                });
        return updateCounts;
    }
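The elided part of setValuesIfAvailable just pulls the next CSV record and binds its columns, roughly like this (a sketch assuming javacsv's com.csvreader.CsvReader; the column indices are illustrative):

    // Sketch of the elided body; assumes javacsv's CsvReader and
    // illustrative column positions.
    try {
        if (!products.readRecord()) {
            return false; // CSV exhausted: end the batch
        }
        pstmt.setString(1, products.get(0)); // e.g. product id
        pstmt.setString(2, products.get(1)); // e.g. product name
        return true; // row bound: include it in this batch
    } catch (IOException e) {
        // readRecord()/get() throw IOException, which the setter's
        // signature doesn't allow, so wrap it
        throw new SQLException(e.getMessage());
    }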
I'm using an Apache DBCP data source. I set the batch size to 2000 and did not change any of the other defaults (auto-commit etc.).
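For reference, the data source is set up roughly like this (a sketch using standard Commons DBCP BasicDataSource properties; the driver class, URL, and credentials below are placeholders):

    import org.apache.commons.dbcp.BasicDataSource;

    // Plain DBCP pool with everything left at its defaults.
    BasicDataSource dataSource = new BasicDataSource();
    dataSource.setDriverClassName("com.sybase.jdbc4.jdbc.SybDriver"); // placeholder
    dataSource.setUrl("jdbc:sybase:Tds:host:5000/mydb");              // placeholder
    dataSource.setUsername("user");
    dataSource.setPassword("password");
    // defaultAutoCommit is not set, so it stays at the DBCP default
    // (true): each batch flush commits on its own.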
Now, when I run the job, it takes 4.5 minutes on average to insert 2000 records, and the job has been running for 2 days (it still hasn't completed).
Can anyone suggest how this can be optimized? Thanks in advance.