arq
Exception serialization improvement
When a job fails, its result is the raised exception, which causes serialization problems for any serializer other than pickle.
async def job_failed(exc: BaseException) -> None:
    self.jobs_failed += 1
    result_data_ = serialize_result(
        function=function_name,
        args=args,
        kwargs=kwargs,
        job_try=job_try,
        enqueue_time_ms=enqueue_time_ms,
        success=False,
        result=exc,
        start_ms=start_ms,
        finished_ms=timestamp_ms(),
        ref=f'{job_id}:{function_name}',
        serializer=self.job_serializer,
        queue_name=self.queue_name,
    )
    await asyncio.shield(self.finish_failed_job(job_id, result_data_))
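The problem can be demonstrated with the standard library alone: pickle round-trips an exception instance, while a JSON-style serializer rejects it. This is a minimal demo of the failure mode, not arq's actual code path:

```python
import json
import pickle

exc = ValueError("boom")

# pickle round-trips exceptions, so arq's default serializer is fine
assert isinstance(pickle.loads(pickle.dumps(exc)), ValueError)

# json (and similarly msgpack) cannot encode an exception instance,
# so serialize_result fails when result=exc and success=False
try:
    json.dumps({"success": False, "result": exc})
except TypeError as e:
    print(e)  # Object of type ValueError is not JSON serializable
```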
Since pickle is the standard serializer, you are apparently using a custom serializer. So it is up to you how to serialize exceptions. Perhaps I misunderstand your question?
How can I serialize exceptions to JSON when a job fails?
How to serialize an exception to JSON is a fairly generic question, outside the scope of arq. StackOverflow has some suggestions.
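For completeness, one common (not arq-specific) approach is to flatten the exception into a plain dict before JSON-encoding; the helper names below are hypothetical:

```python
import json
import traceback


def exc_to_dict(exc: BaseException) -> dict:
    # Keep the type name, message, and formatted traceback as plain strings
    return {
        "type": type(exc).__name__,
        "message": str(exc),
        "traceback": "".join(
            traceback.format_exception(type(exc), exc, exc.__traceback__)
        ),
    }


def dict_to_exc(d: dict) -> Exception:
    # Rebuild a generic Exception carrying the original details;
    # the concrete exception class is not restored
    return Exception(f"{d['type']}: {d['message']}")


try:
    raise ValueError("boom")
except ValueError as e:
    payload = json.dumps(exc_to_dict(e))

restored = dict_to_exc(json.loads(payload))
print(restored)  # prints "ValueError: boom"
```

The trade-off: the JSON payload is portable, but re-raising yields a generic `Exception` rather than the original class.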
I think what the OP meant to ask was how to serialize the exception when job.result() is supposed to re-raise it but can't due to serialization.
I'm on version 0.25.0. When job.result() is called on a job using the msgpack deserializer, it immediately fails with arq.jobs.SerializationError: unable to serialize result. From the code, deserialization happens before re-raising, and since msgpack cannot serialize an exception, info.result will not be an Exception.
As a serializer, msgpack cannot deal with the case where the job failed due to an exception.
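A possible workaround is a custom serializer pair that tags exception instances so the failed-result payload survives the round trip. The sketch below uses json standing in for msgpack (msgpack's `default=`/`object_hook=` hooks work the same way); `job_serializer` and `job_deserializer` are the hook names arq's Worker accepts, and the tagging scheme itself is an assumption:

```python
import json
from typing import Any


def job_serializer(obj: Any) -> bytes:
    # Encode exceptions as a tagged dict that JSON can handle
    def default(o: Any) -> Any:
        if isinstance(o, BaseException):
            return {"__exc__": type(o).__name__, "args": [str(a) for a in o.args]}
        raise TypeError(f"cannot serialize {type(o).__name__}")

    return json.dumps(obj, default=default).encode()


def job_deserializer(data: bytes) -> Any:
    # Turn tagged dicts back into (generic) exception instances
    def object_hook(d: dict) -> Any:
        if "__exc__" in d:
            # The original exception class is not restored
            return Exception(f"{d['__exc__']}: {', '.join(d['args'])}")
        return d

    return json.loads(data, object_hook=object_hook)


blob = job_serializer({"success": False, "result": ValueError("boom")})
result = job_deserializer(blob)["result"]
print(result)  # prints "ValueError: boom"
```

With hooks like these passed as `Worker(job_serializer=..., job_deserializer=...)`, info.result would at least be an Exception again, so job.result() has something to re-raise.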