sunny bindal opened BATCH-2534 and commented

According to the documentation, the commit interval is supposed to work like this:
List<Object> items = new ArrayList<>();
for (int i = 0; i < commitInterval; i++) {
    Object item = itemReader.read();
    Object processedItem = itemProcessor.process(item);
    items.add(processedItem);
}
itemWriter.write(items);
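For illustration, the documented per-item model can be sketched as runnable Java. The `read`/`process`/`write` methods below are hypothetical stand-ins for `ItemReader`, `ItemProcessor`, and `ItemWriter`, not Spring Batch types; they record each call so the ordering is visible:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the DOCUMENTED model: read one item, process it immediately,
// repeat until the chunk is full, then write the whole chunk once.
public class InterleavedChunkSketch {
    public static final List<String> calls = new ArrayList<>();

    // Hypothetical stand-ins, not Spring Batch interfaces.
    static Integer read(int i)            { calls.add("read");    return i; }
    static Integer process(Integer item)  { calls.add("process"); return item * 2; }
    static void write(List<Integer> out)  { calls.add("write"); }

    public static void main(String[] args) {
        int commitInterval = 3;
        List<Integer> items = new ArrayList<>();
        for (int i = 0; i < commitInterval; i++) {
            Integer item = read(i);            // read one item
            Integer processed = process(item); // process it right away
            items.add(processed);
        }
        write(items);                          // single write per chunk
        System.out.println(String.join(",", calls));
        // prints: read,process,read,process,read,process,write
    }
}
```

Under this model, read and process calls interleave item by item, which is what the documentation's pseudocode suggests.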
But in reality that is not what happens. As implemented in "org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(StepContribution, ChunkContext)", the step first reads a full commit interval of items, then processes them all, and only then calls write. In effect it is:
List<Object> items = new ArrayList<>();
for (int i = 0; i < commitInterval; i++) {
    Object item = itemReader.read();
    items.add(item);
}
List<Object> processedItems = new ArrayList<>();
for (Object item : items) {
    processedItems.add(itemProcessor.process(item));
}
itemWriter.write(processedItems);
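The actual ordering described above can likewise be sketched as a runnable simulation. Again, `read`/`process`/`write` are hypothetical stand-ins that record call order, not the real Spring Batch classes:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the ACTUAL ordering reported for ChunkOrientedTasklet:
// read a full commit interval, then process the buffered items, then write.
public class BufferedChunkSketch {
    public static final List<String> calls = new ArrayList<>();

    // Hypothetical stand-ins, not Spring Batch interfaces.
    static Integer read(int i)            { calls.add("read");    return i; }
    static Integer process(Integer item)  { calls.add("process"); return item * 2; }
    static void write(List<Integer> out)  { calls.add("write"); }

    public static void main(String[] args) {
        int commitInterval = 3;

        // Phase 1: read until the chunk is full.
        List<Integer> inputs = new ArrayList<>();
        for (int i = 0; i < commitInterval; i++) {
            inputs.add(read(i));
        }

        // Phase 2: process the whole buffered chunk.
        List<Integer> outputs = new ArrayList<>();
        for (Integer item : inputs) {
            outputs.add(process(item));
        }

        // Phase 3: single write for the chunk.
        write(outputs);

        System.out.println(String.join(",", calls));
        // prints: read,read,read,process,process,process,write
    }
}
```

Comparing the two printed call sequences makes the issue concrete: the documented model interleaves read and process per item, while the implementation batches all reads before any processing.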
Please look into this.
Affects: 3.0.7
Issue Links:
BATCH-2423 5.1 Chunk-Oriented Processing does not represent the actual implementation ("duplicates")