
Clean way to update repeatable job data?

Open gmcnaught opened this issue 7 years ago • 23 comments

The use case would be to update the data for the next run of a job in an atomic fashion, which currently seems to only be possible by using removeRepeatableJob and re-adding the repeatable job, and only while the job is not actively running.

gmcnaught avatar Oct 18 '17 21:10 gmcnaught
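
For reference, a minimal sketch of the remove-and-re-add workaround described above, assuming a Bull version that provides getRepeatableJobs and removeRepeatableByKey; the job name, cron expression, and newData are placeholders. Note that it is not atomic: a run scheduled between the two calls can still fire with the old data, which is exactly the gap this issue asks to close.

import Queue from 'bull';

const queue = new Queue('my-queue');

async function updateRepeatableData(newData: object) {
  // Find the existing repeatable entry by name and cron expression.
  const repeatables = await queue.getRepeatableJobs();
  const existing = repeatables.find(
    (r) => r.name === 'sync' && r.cron === '*/5 * * * *',
  );

  // Remove the old schedule, then re-add it with the new data.
  // Not atomic: a run can still fire between these two calls.
  if (existing) {
    await queue.removeRepeatableByKey(existing.key);
  }
  await queue.add('sync', newData, { repeat: { cron: '*/5 * * * *' } });
}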

I will mark this issue as a future enhancement.

manast avatar Oct 18 '17 21:10 manast

I also need this. Has anyone found a way to update a repeatable job's data during processing?

lossau avatar Oct 27 '17 13:10 lossau

This is a really needed feature.

The documentation was not clear to me, and I was adding the job every time my node process started, which caused multiple jobs to run on every npm start.

I cannot use the @gmcnaught workaround because I don't always know what the old value of RepeatOpts is. So I'm cleaning the queue before adding the job again, and using a jobId (defined in a jobs_constants file) to avoid duplicates:

import Queue from 'bull';

import { REMINDER_FOR_TRACKING_HOURS } from '../config/jobs_constants';

export default async (RedisUrl, RedisOptions) => {
  // Build the `queue` object and register the job processors here
  // (the queue name is just an example).
  const queue = new Queue('reminders', RedisUrl, RedisOptions);

  // Drain anything previously scheduled, then re-add the repeatable job.
  // The fixed jobId (defined in jobs_constants) avoids duplicates.
  await queue.empty();
  await queue.add(null, {
    jobId: REMINDER_FOR_TRACKING_HOURS,
    repeat: { cron: '0 10 26-31 * *' }
  });
};

An addOrUpdate method would be really useful, perhaps with a required jobId or some other unique identifier.

abelosorio avatar Oct 31 '17 13:10 abelosorio

I will look into this as soon as I have a free slot.

manast avatar Oct 31 '17 14:10 manast

Any update on this?

mugli avatar Nov 27 '17 09:11 mugli

Having a job.update() that works for repeatable jobs would be really nice. I expose jobs through a GraphQL API, but without any data in them... :/

kizdolf avatar Jan 08 '18 10:01 kizdolf

I also need to update repeatable jobs.

LukaszWiktor avatar Oct 10 '18 13:10 LukaszWiktor

I think I've mentioned this in another thread:

We augmented Bull by storing the job data for repeatable jobs in a Redis hash keyed by job name. That way we can atomically update the hash with new data, and the first step in job.process pulls the hash data and updates the current job, so future runs will use it.

We've explored adding this into the Redis/Lua scripts to integrate it into the system directly and reduce the number of network calls, but it's not been a high priority. We have also considered handling deletes the same way: keeping a queue of repeatable jobs that need to be deleted and removing them asynchronously on their next scheduled run.

gmcnaught avatar Oct 10 '18 14:10 gmcnaught
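
Not gmcnaught's actual implementation, but a minimal sketch of that side-hash idea, using hypothetical key and job names and a plain ioredis client alongside Bull:

import Queue from 'bull';
import Redis from 'ioredis';

const redis = new Redis();              // same Redis instance the queue points at
const queue = new Queue('maintenance');

// Writers update the side hash atomically; the repeatable job itself is never touched.
async function setRepeatableData(jobName: string, data: Record<string, string>) {
  await redis.hset(`repeat-data:${jobName}`, data);
}

queue.process('daily-report', async (job) => {
  // First step of processing: pull the latest data for this job name
  // and fold it into the current run.
  const data = await redis.hgetall(`repeat-data:${job.name}`);
  await job.update({ ...job.data, ...data });
  // ... do the actual work with the merged data ...
});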

Thanks @gmcnaught!

I was trying to keep all the required information to run a job in its data, but it turns out I need to store it somewhere on the side.

LukaszWiktor avatar Oct 10 '18 14:10 LukaszWiktor

I've faced the same need. It would be awesome if we had an easier way to update repeatable jobs.

sor-barroso avatar Feb 21 '19 21:02 sor-barroso

Any update on this? Does job.update() work for repeatable jobs as well?

crispinkoech avatar Nov 20 '19 15:11 crispinkoech

This feature would be very useful for cron jobs that need to query something since the last invocation; it's a pity this doesn't work by default.

wasd171 avatar Sep 08 '20 11:09 wasd171
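
One way to get "since the last invocation" semantics today, without mutating the repeatable job's data, is to keep a small watermark key in Redis next to the queue. A minimal sketch; the key name and the extra ioredis client are assumptions, not part of Bull's API:

import Queue from 'bull';
import Redis from 'ioredis';

const redis = new Redis();
const queue = new Queue('sync');

queue.process(async () => {
  // Read the watermark left by the previous run (epoch start on the first run).
  const lastRun = await redis.get('sync:last-run');
  const since = lastRun ? new Date(lastRun) : new Date(0);

  // ... query everything that changed after `since` ...

  // Persist the new watermark so the next repeat run picks it up.
  await redis.set('sync:last-run', new Date().toISOString());
});

queue.add(null, { repeat: { cron: '0 * * * *' } });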

@manast Will this be implemented in BullMQ as well?

rileyai-dev avatar Nov 11 '20 12:11 rileyai-dev

Any progress on this issue?

MaximilianHollis avatar Feb 07 '22 04:02 MaximilianHollis

Any progress on this issue?

No

hf29h8sh321 avatar Mar 20 '22 03:03 hf29h8sh321

It has been 5 years since this was first requested; it would be great to have it.

RanKey1496 avatar Nov 24 '22 06:11 RanKey1496

In case anyone finds it useful:

import { Queue } from 'bullmq';

// `log` and `addJob` are this project's own logging and job-creation helpers.
export async function updateJobsOfQueueWithNewPattern(queue: Queue, pattern: string | undefined) {
    if (pattern) {
        log.info(`Update jobs of queue ${queue.name} with new pattern ${pattern}`);
        const connection = await queue.client;
        // Scan the raw Redis keys of the queue's repeat job instances to recover their data.
        // Note: KEYS walks the whole keyspace, so this is only suitable for small deployments.
        const keys = await connection.keys(`bull:${queue.name}:repeat:*:*`);
        // Only touch repeatable jobs whose cron pattern differs from the new one.
        const jobs = (await queue.getRepeatableJobs()).filter((job) => job.pattern !== pattern);
        for (const job of jobs) {
            log.info(`job: ${JSON.stringify(job)}`);
            let redisData;
            for (const key of keys) {
                const hash = await connection.hgetall(key);
                if (hash.name === job.name) {
                    redisData = hash.data;
                    break;
                }
            }
            if (redisData) {
                log.info(`Update job ${job.name} with new pattern ${pattern}`);
                const options = { repeat: { pattern }, jobId: job.id };
                // Remove the old repeatable definition and re-add it with the same data
                // under the new pattern.
                await queue.removeRepeatableByKey(job.key);
                await addJob(job.name, queue, JSON.parse(redisData), options, false);
            }
        }
    }
}

A bit dirty, but it does what I wanted.

paco-sisutech avatar Jun 12 '23 09:06 paco-sisutech

Also would love this feature :)

dlebee avatar Jun 12 '23 21:06 dlebee

It has been 7 years and this problem still hasn't been solved. I want to update a repeatable job by deleting and re-creating it, but I cannot obtain the old data through getRepeatableJobs. Please look into this, @manast.

Pandaver avatar Apr 16 '24 02:04 Pandaver
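
For what it's worth, getRepeatableJobs only returns scheduling metadata (key, name, cron/pattern, next run), not the job's payload. One way to recover the old data before removing and re-creating the schedule is to read the pending delayed instance that is kept for the next run. A rough sketch, assuming BullMQ and relying on repeat instances getting "repeat:"-prefixed job ids:

import { Queue } from 'bullmq';

// Recover the payload of a repeatable job before removing and re-adding it.
async function getRepeatableJobData(queue: Queue, name: string) {
  const delayed = await queue.getDelayed();
  // Repeatable runs are scheduled as delayed jobs whose ids start with "repeat:".
  const next = delayed.find(
    (job) => job.name === name && String(job.id).startsWith('repeat:'),
  );
  return next?.data;
}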