raven-python
Add handler to celery task logger
Celery sets up 2 kinds of loggers: a global logger and task loggers. The celery docs include an example of how to use the task loggers, which makes task loggers look like the recommended way to log from celery tasks.
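For reference, the task-logger pattern from the celery docs looks roughly like this (a minimal sketch; the app name, broker and task are illustrative):

from celery import Celery
from celery.utils.log import get_task_logger

app = Celery('proj', broker='redis://localhost')  # app/broker are illustrative
logger = get_task_logger(__name__)  # a task logger, distinct from the global celery logger

@app.task
def fetch(url):
    logger.error('could not fetch %s', url)  # emitted through the task logger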
The register_logger_signal hook in the sentry celery integration adds a handler to the global celery logger using the after_setup_logger signal. This does not affect the task loggers, meaning that any exception logged by a task logger will not be captured by sentry.
This pull request modifies register_logger_signal to connect to the after_setup_task_logger signal as well, allowing exceptions logged by task loggers to be captured.
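To illustrate, a typical project wires up the integration roughly like this (a sketch following the raven celery docs; the client configuration is illustrative). With only after_setup_logger connected, the SentryHandler lands on the global celery logger, and records emitted through a task logger like the one above never reach it:

import logging

from raven import Client
from raven.contrib.celery import register_signal, register_logger_signal

client = Client()  # DSN picked up from the environment; illustrative
register_logger_signal(client, loglevel=logging.ERROR)  # adds a handler via after_setup_logger only
register_signal(client)  # captures task failures, not records from task loggers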
+1 for this, but maybe you should separate registering global logger handlers from task logger handlers:
register_logger_signal(client, loglevel=logging.ERROR)
register_task_logger_signal(client, loglevel=logging.ERROR)
This can be done by passing the celery signal as an argument and creating proxy functions:
...
from celery.signals import after_setup_logger, after_setup_task_logger
...

def _register_logger_signal(signal, client, logger=None, loglevel=logging.ERROR):
    filter_ = CeleryFilter()

    handler = SentryHandler(client)
    handler.setLevel(loglevel)
    handler.addFilter(filter_)

    def process_logger_event(sender, logger, loglevel, logfile, format,
                             colorize, **kw):
        # Attempt to find an existing SentryHandler, and if it exists ensure
        # that the CeleryFilter is installed.
        # If one is found, we do not attempt to install another one.
        for h in logger.handlers:
            if type(h) == SentryHandler:
                h.addFilter(filter_)
                return False
        logger.addHandler(handler)

    signal.connect(process_logger_event, weak=False)

def register_logger_signal(client, logger=None, loglevel=logging.ERROR):
    _register_logger_signal(after_setup_logger, client, logger, loglevel)

def register_task_logger_signal(client, logger=None, loglevel=logging.ERROR):
    _register_logger_signal(after_setup_task_logger, client, logger, loglevel)
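Usage would then look something like this (just a sketch; register_task_logger_signal is the proposed helper above, not a released raven API, and the loglevels only illustrate configuring the two kinds of loggers independently):

import logging

from raven import Client
from raven.contrib.celery import (register_signal, register_logger_signal,
                                  register_task_logger_signal)

client = Client()
register_signal(client)
register_logger_signal(client, loglevel=logging.ERROR)         # global celery logger
register_task_logger_signal(client, loglevel=logging.WARNING)  # task loggers, configured separately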
@muravitskiy I'm not sure I understand the benefit of your approach. What's wrong with using the same handler for the global logger and the task loggers?
@omarkhan Celery separates these two loggers, and you should be able to configure them separately
+1 for this as well. I was debugging why logger.error() messages inside celery workers aren't propagated to sentry and reached the same conclusion.
Agreed, this change allowed me to properly log to sentry.io from within a celery task using logger.error(). Trying to log error messages (not exceptions) to sentry from a celery task seems to fail otherwise with the usual setup:
https://docs.sentry.io/clients/python/integrations/celery/
Using
django==1.9.9
django-celery==3.1.17
raven==5.27.1
Thank you @omarkhan for this - it's a few years later, but we've taken this PR and attached a similar handler from the outside. Would be great to merge!
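For anyone else landing here, "attaching a similar handler from the outside" can be done roughly like this (a minimal sketch, assuming a configured raven Client; the function name and level are illustrative):

import logging

from celery.signals import after_setup_task_logger
from raven import Client
from raven.handlers.logging import SentryHandler

client = Client()  # assumes the DSN is configured elsewhere (e.g. SENTRY_DSN)

def attach_sentry_to_task_loggers(sender=None, logger=None, **kwargs):
    # Add a SentryHandler to each task logger celery sets up, unless one is
    # already present.
    if not any(isinstance(h, SentryHandler) for h in logger.handlers):
        handler = SentryHandler(client)
        handler.setLevel(logging.ERROR)
        logger.addHandler(handler)

after_setup_task_logger.connect(attach_sentry_to_task_loggers, weak=False)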