I am using Spring Batch.
I have a reader whose input file contains N lines, all with the same format.
I read the file in chunks, with the commit-interval set to 100.
I want to match each row against occurrences from the database.
A single row can be paired with a large volume of data from the database.
For example: GT;123;456;PICK
This one record can refer to more than 600 occurrences in my database.
As a result, my program ends with an OutOfMemoryError.
How can I keep the commit interval at 100? Do you have any recommendations for sizing the commit interval in this situation?
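For reference, my step is configured roughly as follows (a simplified Java-config sketch; the bean, class, and record names are placeholders for my real ones):

```java
import java.util.List;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class EnrichmentJobConfig {

    // Placeholder types: one parsed input line (e.g. "GT;123;456;PICK")
    // and the same line enriched with its matching database occurrences.
    public record InputLine(String type, String field1, String field2, String action) {}
    public record EnrichedLine(InputLine line, List<String> occurrences) {}

    @Bean
    public Step enrichmentStep(JobRepository jobRepository,
                               PlatformTransactionManager transactionManager,
                               ItemReader<InputLine> reader,
                               ItemProcessor<InputLine, EnrichedLine> processor,
                               ItemWriter<EnrichedLine> writer) {
        return new StepBuilder("enrichmentStep", jobRepository)
                // commit-interval / chunk size: 100 input lines per transaction
                .<InputLine, EnrichedLine>chunk(100, transactionManager)
                .reader(reader)              // reads lines like "GT;123;456;PICK"
                .processor(processor)        // matches each line against the database
                .writer(writer)
                .build();
    }
}
```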
Thanks in advance for your answer.
This issue can happen even without Spring Batch. You need to make sure enough memory is allocated to hold a full chunk of items, and if you are enriching items with data from another source, there must also be enough memory for the items plus that additional data.
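As a rough illustration (the numbers and type names are placeholders, not measurements): with a commit interval of 100 and up to ~600 matching rows per item, a single chunk can hold on the order of 60,000 enriched rows before anything is written. You either size the heap for that, or reduce the chunk size so less data is live per transaction. A sketch of both options, reusing the placeholder types from the question's configuration:

```java
import java.util.List;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class TunedEnrichmentJobConfig {

    // Same placeholder types as in the question's sketch.
    public record InputLine(String type, String field1, String field2, String action) {}
    public record EnrichedLine(InputLine line, List<String> occurrences) {}

    // Option 1: keep chunk(100) and start the JVM with enough heap for
    // 100 items * ~600 rows each, e.g. java -Xmx2g -jar my-batch-job.jar
    // (the -Xmx value is only an example; measure your real item size first).

    // Option 2: lower the chunk size so fewer enriched items are held in
    // memory per transaction, at the cost of more frequent, smaller commits.
    @Bean
    public Step enrichmentStep(JobRepository jobRepository,
                               PlatformTransactionManager transactionManager,
                               ItemReader<InputLine> reader,
                               ItemProcessor<InputLine, EnrichedLine> processor,
                               ItemWriter<EnrichedLine> writer) {
        return new StepBuilder("enrichmentStep", jobRepository)
                .<InputLine, EnrichedLine>chunk(10, transactionManager) // 10 instead of 100
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }
}
```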