
[Question] How to use Loguru to save a JSON payload on GCP

Open TheLegendAli opened this issue 1 year ago • 3 comments

This is the code that I run on GCP:

import logging

from google.cloud.logging_v2 import Client
from google.cloud.logging_v2.handlers.handlers import CloudLoggingHandler
from google.cloud.logging_v2.handlers import setup_logging

LOG_NAME = "my-log-name"  # placeholder for the actual log name in my setup

client = Client()
handler = CloudLoggingHandler(client, name=LOG_NAME)
setup_logging(handler)

data_dict = {"hello": "world"}
# "json_fields" ends up as structured data in the jsonPayload on GCP
logging.info("testing", extra={"json_fields": data_dict})

and the results show up like this ([screenshot: GCP log entry with a jsonPayload field]), where I can grab the JSON payload.

Looking at the docs: https://loguru.readthedocs.io/en/stable/api/logger.html#loguru._logger.Logger.contextualize

the best I can do is turn the dict into a string and put it as part of the message, not in the JSON payload. Is there a way to do this in Loguru?
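
Right now my workaround looks roughly like this (just a sketch to illustrate what I mean):

import json
from loguru import logger

data_dict = {"hello": "world"}

# the dict only ends up as text inside the message, not as separate jsonPayload fields
logger.info("testing {}", json.dumps(data_dict))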

TheLegendAli avatar Dec 04 '23 21:12 TheLegendAli

Sorry, but I don't understand what you're trying to achieve. In the screenshot you shared, all the data you logged seems to be available in the jsonPayload on GCP. What do you think is missing or ill-formatted?

Delgan avatar Dec 04 '23 21:12 Delgan

Hi @Delgan, thanks so much for your work on this awesome library.

I was trying to do something similar to @TheLegendAli, i.e. add a correlation ID (in my case, session_id and query_id) to GCP logs, so that I can filter queries by these IDs.

Just wanted to share what I ended up with, after consulting your other suggestions in https://github.com/Delgan/loguru/issues/789 and https://github.com/Delgan/loguru/issues/812#issuecomment-1485272193, and also Loguru's docs.

# in src/main.py - consider this a webserver
from src.logging_config import add_gcp_log_sink, add_metadata_to_logger

add_gcp_log_sink()

def handle_request(request_payload):
    add_metadata_to_logger(session_id=request_payload["session_id"], query_id=request_payload["query_id"])

    # do other stuff

# in src/logging_config.py
from google.cloud import logging
from loguru import logger

def add_gcp_log_sink(gcp_log_name="my-gcp-log-name") -> None:
    logging_client = logging.Client()
    gcp_logger = logging_client.logger(name=gcp_log_name)

    def log_to_gcp(message):
        # `message` is Loguru's formatted record; send it to GCP with the IDs as labels
        gcp_logger.log_text(
            message,
            severity=message.record["level"].name,
            labels={
                "session_id": message.record["extra"]["session_id"],
                "query_id": message.record["extra"]["query_id"],
            },
        )

    # Prevent duplicate logs by adding the GCP sink only if Loguru still has a single
    # handler, i.e. the default stderr sink (note: `_core` is a private attribute)
    if len(logger._core.handlers) == 1:
        logger.add(sink=log_to_gcp)


def add_metadata_to_logger(session_id: str, query_id: str) -> None:
    logger.configure(extra={"session_id": session_id, "query_id": query_id})

The result :-)

[Screenshot: GCP log entry with session_id and query_id shown as labels]
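
P.S. If you wanted the fields inside jsonPayload itself (as in the original question) rather than in labels, I think a log_struct-based sink along these lines should also work (untested sketch, it would replace the inner log_to_gcp() above):

def log_to_gcp(message):
    # send the message plus the extra fields as structured data,
    # which Cloud Logging stores as jsonPayload instead of labels
    gcp_logger.log_struct(
        {"message": message.record["message"], **message.record["extra"]},
        severity=message.record["level"].name,
    )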

dtan2-wiq avatar Jun 20 '24 04:06 dtan2-wiq

Thanks for sharing @dtan2-wiq.

It's probably possible to use bind() locally in handle_request() instead of configure(); it might also be more thread-safe.
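
Something like this (untested sketch, adapting your handle_request()):

def handle_request(request_payload):
    # bind() returns a new logger carrying the extra values, so concurrent
    # requests don't overwrite each other's metadata (unlike a global configure())
    request_logger = logger.bind(
        session_id=request_payload["session_id"],
        query_id=request_payload["query_id"],
    )
    request_logger.info("handling request")

    # do other stuff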

Delgan avatar Jun 23 '24 09:06 Delgan