
Malformed JSON log records

Open · mr2rm opened this issue 7 months ago · 1 comment

I have a FastAPI application served by Gunicorn that writes contextualized JSON logs to a file sink using a patcher. Here is a simplified version of my logger configuration:

import json
from loguru import logger

def attach_json_log(record):
    log_data = dict(
        timestamp=record["time"].isoformat(),
        application={
            "name": "My App",  # from env variable
            "version": "1.0.0",   # from env variable
        },
        level=record["level"].name,
        message=record["message"],
        extra=record["extra"],
    )
    record["extra"]["_json_"] = json.dumps(log_data)

logger.add(
    "/var/log/app.log",
    level="INFO",
    colorize=False,
    enqueue=True,
    rotation="1 days",
    retention="1 months",
    format=lambda _: "{extra[_json_]}\n",
)
logger.configure(patcher=attach_json_log)

The issue is that sometimes the schema of the JSON records in the log file does not match the defined schema. In some cases fields are missing; in others, all fields are present but filled with the wrong values. The strange part is that in every case the records written to the file are still valid JSON objects, which doesn't make sense to me at all! For instance, here is a malformed record I found in the log file:

{
  "application": {
    "": "SOME OTHER VALUE",
    "name": "SOME OTHER VALUE"
  },
  "extra": {
    "x": null
  }
}
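To check my understanding of the symptom, I wrote a standalone sketch (no Gunicorn, Loguru, or my app involved; all names and values are illustrative) in which one thread rewrites a shared dict while another repeatedly serializes it with json.dumps. Whatever interleaving occurs, every snapshot still parses as valid JSON, which at least matches the "valid JSON, wrong values" behaviour above:

```python
import json
import threading

# A dict shared between a "request" thread and a "logging" thread,
# standing in for the contextualized extra data (illustrative only).
shared = {"name": "My App", "version": "1.0.0"}
stop = threading.Event()

def writer():
    # Flips a field back and forth, the way concurrent requests
    # rebinding context values might.
    while not stop.is_set():
        shared["name"] = "SOME OTHER VALUE"
        shared["name"] = "My App"

t = threading.Thread(target=writer)
t.start()
snapshots = [json.dumps(shared) for _ in range(10000)]
stop.set()
t.join()

# Whichever value each snapshot caught, it is still valid JSON
# with the expected keys:
parsed = [json.loads(s) for s in snapshots]
assert all(p.keys() == {"name", "version"} for p in parsed)
```

This doesn't prove my logs are hit by the same race, but it shows the symptom is at least consistent with concurrent mutation of a shared dict.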

I noticed some issues have already been reported for the enqueue=True option with Gunicorn workers. However, since my Gunicorn setup runs a single worker with 4 threads, I don't think this problem is related to those issues.
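In case it helps narrow this down, here is a deep-copy variant of my patcher that I'm considering as a guard. It assumes (unconfirmed) that another thread mutates the shared extra dict while json.dumps iterates over it. The record below is a plain dict with stand-ins for Loguru's time and level objects, so the sketch runs on its own:

```python
import copy
import json
from datetime import datetime, timezone
from types import SimpleNamespace

def attach_json_log(record):
    # Snapshot "extra" before serializing so a concurrent writer
    # cannot change it mid-dump (defensive guess, not a confirmed fix).
    extra_snapshot = copy.deepcopy(record["extra"])
    log_data = {
        "timestamp": record["time"].isoformat(),
        "application": {
            "name": "My App",      # from env variable
            "version": "1.0.0",    # from env variable
        },
        "level": record["level"].name,
        "message": record["message"],
        "extra": extra_snapshot,
    }
    record["extra"]["_json_"] = json.dumps(log_data)

# Fake record standing in for Loguru's record dict (illustrative only).
record = {
    "time": datetime(2024, 7, 31, tzinfo=timezone.utc),
    "level": SimpleNamespace(name="INFO"),
    "message": "hello",
    "extra": {"x": None},
}
attach_json_log(record)
print(record["extra"]["_json_"])
```

If the malformed records stop after adding the copy, that would point at shared mutable state; if they persist, the cause is elsewhere.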

Any response would be appreciated. Thank you in advance for your effort.

mr2rm · Jul 31 '24 09:07