Improve batchGet / batchWrite to iterate over records in chunks of 25.
Discussed in https://github.com/sensedeep/dynamodb-onetable/discussions/235
Originally posted by ebisbe, January 2, 2022. I have a batch process that does 3 simple things: query items from DDB (I've limited it to 500), send each item to a queue, and then delete each item. I'd like to be able to do something like:
const Items = await Model.find(..., {limit: 500})
let batch = {}
for (const Item of Items) {
    // send SQS message
    await Model.remove(..., {batch})
}
await table.batchWrite(batch)
But right now, to use batchWrite, I need to do this:
const Items = await Model.find(..., {limit: 500})
const operations = []
for (const Item of Items) {
    // send SQS message
    operations.push({...})
}
while (operations.length > 0) {
    const batch = {}
    // DynamoDB limits batch operations to 25 items
    const operationsSlice = operations.splice(0, 25)
    for (const operation of operationsSlice) {
        await Model.remove(operation, {batch})
    }
    await table.batchWrite(batch)
}
It would be really handy if batchGet/batchWrite handled the 25-item limit imposed by DDB by doing the slicing itself. Maybe add an option so the user acknowledges they are sending more than 25 items.
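A minimal sketch of how that slicing could work in user code today. The `chunk` helper is plain JavaScript; `batchRemoveAll` is a hypothetical wrapper name (not part of the OneTable API) that assumes the documented `Model.remove(properties, {batch})` and `Table.batchWrite(batch)` calls:

```javascript
// Split an array into groups of at most `size` items
// (25 is the DynamoDB batch operation limit)
function chunk(items, size) {
    const groups = []
    for (let i = 0; i < items.length; i += size) {
        groups.push(items.slice(i, i + size))
    }
    return groups
}

// Hypothetical helper: remove any number of items in batches of 25
async function batchRemoveAll(table, model, operations) {
    for (const group of chunk(operations, 25)) {
        const batch = {}
        for (const operation of group) {
            await model.remove(operation, {batch})
        }
        await table.batchWrite(batch)
    }
}
```

If the library adopted something like this internally, batchWrite could accept arbitrarily many operations and issue one DynamoDB request per 25-item group.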
Merged.
Hi. Sorry to necrobump, but was this actually merged? I can't see any relevant changes in the code or a pull request, and locally batchWrite() still fails with the 25-document limit when trying to submit more than 25 documents.
I think you are right. Perhaps I got confused with another batch-related PR.
I'll reopen and flag this.