
Chunking input/updates

Daemach opened this issue 3 years ago

I find myself constantly breaking up large inserts and updates. As a convenience feature, it would be nice to have this automated. For example, I would like to `updateOrInsert` 1000 records at a time. I do this with `.each()` now, but it would be nice to avoid rewriting that code constantly. I would also like to insert 10,000 rows at a time without having to chunk manually around the parameter limit. (The data comes from another database across a VPN, so it has to be done via CF or a bulk insert rather than a database-to-database join.) Here's what I'm doing now:

// gainLoss is the output from a stored proc, converted to an array of structs.
// SQL Server allows at most 2100 parameters per statement, so size each chunk
// so that (rows per chunk) x (columns per row) stays under that limit.
if (gainLoss.len()) {
    var chunkSize = floor(2100 / gainLoss[1].keyArray().len());

    collect(gainLoss)
        .chunk(chunkSize)
        .get()
        .each((chunk) => {
            utils.message('Gain Loss => chunking...');
            qb.newQuery()
                .from('mars_in_gain_loss')
                .insert(chunk);
        });
}
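For reference, the boilerplate above could be wrapped once in a reusable helper. This is only a sketch of the requested convenience, not part of qb's current API: `insertChunked` is a hypothetical name, and it assumes the same `collect()` helper (cfcollection) used in the snippet above. The `maxParams` default reflects SQL Server's 2100-parameter limit; other engines would pass a different value.

```
// Hypothetical helper (not part of qb) wrapping the chunking pattern above.
// rows: array of structs, all sharing the same keys.
function insertChunked(required qb, required string table, required array rows, numeric maxParams = 2100) {
    if (!rows.len()) {
        return;
    }
    // Rows per chunk = total allowed params divided by columns per row.
    var chunkSize = floor(maxParams / rows[1].keyArray().len());

    collect(rows)
        .chunk(chunkSize)
        .get()
        .each((chunk) => {
            qb.newQuery().from(table).insert(chunk);
        });
}

// Usage, matching the example above:
// insertChunked(qb, 'mars_in_gain_loss', gainLoss);
```

Built into qb itself, this could instead be a `chunkSize` option on `insert`/`update`, which is essentially what this issue is asking for.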

Daemach avatar Mar 12 '21 20:03 Daemach