
Manually retrying job from job result object

Open pratheekrebala opened this issue 5 years ago • 4 comments

Hey Sam,

Would you be open to adding functionality to manually trigger a retry on a job from a job result object?

Something like this. It currently requires manually specifying queue_name, but we could get that from the job result if you are OK with also allowing #161:

from arq.constants import result_key_prefix

async def retry_job(job, queue_name='process_legislation'):
    # remove the old job result so the same job id can be enqueued again
    await arq.delete(result_key_prefix + job.job_id)
    # re-enqueue with the same function, args, kwargs and job id
    new_job = await arq.enqueue_job(
        job.function,
        *job.args,
        _queue_name=queue_name,
        _job_id=job.job_id,
        _defer_by=5,
        **job.kwargs)
    return new_job

pratheekrebala avatar Nov 02 '19 16:11 pratheekrebala

Yes, where would this function go?

samuelcolvin avatar Nov 02 '19 17:11 samuelcolvin

I was thinking it could also be a method on ArqRedis that accepts a JobResult object:

from arq.connections import create_pool
arq = await create_pool()
await arq.retry_job(result)

It would be nice to be able to call this directly from the result object (await result.retry()), but that might get complicated, since we would also have to keep track of which connection to use.

Do you have a preference?
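To make the shape of the ArqRedis version concrete, here is a minimal sketch of the retry flow using an in-memory stand-in instead of a real Redis connection. The class names, fields and the retry_job signature below are assumptions sketched from this thread, not arq's actual API:

```python
import asyncio
from dataclasses import dataclass, field

# assumed result-key prefix, standing in for arq.constants.result_key_prefix
RESULT_KEY_PREFIX = "arq:result:"

@dataclass
class JobResult:
    # assumed subset of the fields a job result would carry
    function: str
    args: tuple
    kwargs: dict
    job_id: str

@dataclass
class FakeArqRedis:
    """In-memory stand-in for ArqRedis, just enough to show the flow."""
    results: dict = field(default_factory=dict)
    queue: list = field(default_factory=list)

    async def delete(self, key):
        # drop a stored result, mimicking a Redis DEL
        self.results.pop(key, None)

    async def enqueue_job(self, function, *args, _queue_name=None,
                          _job_id=None, _defer_by=None, **kwargs):
        job = {"function": function, "args": args, "kwargs": kwargs,
               "queue": _queue_name, "job_id": _job_id}
        self.queue.append(job)
        return job

    async def retry_job(self, result: JobResult,
                        queue_name="arq:queue", defer_by=5):
        # delete the stale result so the same job id can be reused
        await self.delete(RESULT_KEY_PREFIX + result.job_id)
        # re-enqueue with the same function, args, kwargs and job id
        return await self.enqueue_job(
            result.function,
            *result.args,
            _queue_name=queue_name,
            _job_id=result.job_id,
            _defer_by=defer_by,
            **result.kwargs,
        )

async def main():
    arq = FakeArqRedis(results={RESULT_KEY_PREFIX + "abc": "old"})
    result = JobResult("process_bill", (42,), {"state": "NY"}, "abc")
    new_job = await arq.retry_job(result)
    print(new_job["job_id"], len(arq.results))

asyncio.run(main())
```

The method-on-ArqRedis shape avoids the connection-tracking problem mentioned above, since the pool the caller holds is the one doing both the delete and the re-enqueue.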

pratheekrebala avatar Nov 03 '19 04:11 pratheekrebala

Yes, it should be a function on ArqRedis.

samuelcolvin avatar Nov 04 '19 10:11 samuelcolvin

I can give this a shot, as I need this functionality in a service I am writing. Specifically, I need to be able to retry an expired job, so I think this will do what I need.

ccharlesgb avatar Jul 29 '20 20:07 ccharlesgb