
log metrics with info level

Open • teto opened this issue 1 year ago • 4 comments

Description

Right now I have to use --debug to find out:

  • if I am using GPU
  • the speed of the inference

These two pieces of information are something I am always interested in, but they are hidden among the many messages produced by --debug.

I would like to log these two pieces of information at the info level.
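For illustration, a minimal sketch (not actual LocalAI code) of what I have in mind, using zerolog; the gpuInUse and tokensPerSecond values are placeholders for whatever the backend actually reports:

```go
package main

import (
	"os"

	"github.com/rs/zerolog"
)

func main() {
	log := zerolog.New(os.Stderr).With().Timestamp().Logger()

	// Placeholder values; in practice these would come from the backend.
	gpuInUse := true
	tokensPerSecond := 42.5

	// Emit at Info level so they are visible without --debug.
	log.Info().
		Bool("gpu", gpuInUse).
		Float64("tokens_per_second", tokensPerSecond).
		Msg("inference metrics")
}
```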

Notes for Reviewers

I've never programmed in Go, so this is my first PR. I suspect it's all wrong and that the main.go logger has nothing to do with the grpc.go logger. Before proceeding, I thought I would ask the maintainers what the correct thing to do here is:

  1. Is the main.go log level known to backend/cpp/llama/grpc-server.cpp? Can it be made known?
  2. How can one set the main.go log level from the CLI? There seems to be just --debug, but I would like to be able to choose warning/error/info/debug (see the sketch after this list).
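For question 2, a minimal sketch of what a --log-level flag could look like, assuming zerolog (which main.go already uses). The flag itself is hypothetical; only --debug exists today:

```go
package main

import (
	"flag"

	"github.com/rs/zerolog"
	"github.com/rs/zerolog/log"
)

func main() {
	// Hypothetical flag; LocalAI currently only exposes --debug.
	levelStr := flag.String("log-level", "info", "one of: error, warn, info, debug")
	flag.Parse()

	// zerolog.ParseLevel accepts the usual level names.
	level, err := zerolog.ParseLevel(*levelStr)
	if err != nil {
		log.Fatal().Err(err).Msgf("invalid log level %q", *levelStr)
	}

	// SetGlobalLevel filters every zerolog logger in the process.
	zerolog.SetGlobalLevel(level)

	log.Debug().Msg("visible only with --log-level=debug")
	log.Info().Msg("visible at info and more verbose levels")
}
```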

Signed commits

  • [ ] Yes, I signed my commits.

teto avatar Apr 03 '24 22:04 teto

Deploy Preview for localai ready!

Latest commit: d460b4c57405b74c380ed89865d52f0654e97935
Latest deploy log: https://app.netlify.com/sites/localai/deploys/660dd7b89c16210008b13315
Deploy Preview: https://deploy-preview-1954--localai.netlify.app

netlify[bot] avatar Apr 03 '24 22:04 netlify[bot]

This is not a great way to expose metrics. In general, metrics should not be placed within log lines. This would be better handled as part of a larger feature adding proper metrics capabilities in something like Prometheus/OTEL format.
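As a sketch of that direction, assuming the standard Prometheus Go client (the metric name, buckets, and port are illustrative, not part of LocalAI today):

```go
package main

import (
	"net/http"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promauto"
	"github.com/prometheus/client_golang/prometheus/promhttp"
)

// Illustrative metric; not an existing LocalAI metric.
var tokensPerSecond = promauto.NewHistogram(prometheus.HistogramOpts{
	Name:    "inference_tokens_per_second",
	Help:    "Observed generation speed per request.",
	Buckets: prometheus.ExponentialBuckets(1, 2, 10), // 1, 2, 4, ... 512
})

func main() {
	// Record the observed speed after each request.
	tokensPerSecond.Observe(42.5)

	// Scrapers (or a human with curl) read the standardized text
	// exposition format from /metrics.
	http.Handle("/metrics", promhttp.Handler())
	if err := http.ListenAndServe(":9090", nil); err != nil {
		panic(err)
	}
}
```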

Also, the usage of the Error log level here is semantically incorrect. Log levels have meaning; they are not just a hierarchy of "how frequently do I want to see this". If something is logged at the Error level, it should be an actual error. If anything, this would be a debug-level log, but again, metrics should probably be handled as part of a larger "let's do metrics properly" feature.

cryptk avatar Apr 04 '24 04:04 cryptk

This is not a great way to expose metrics. In general, metrics should not be placed within log lines. This would be better handled as part of a larger feature adding proper metrics capabilities in something like Prometheus/OTEL format.

A custom format is nice, but it raises the barrier to entry by requiring extra infrastructure: why not do both?

Also, the usage of the Error log level here is semantically incorrect.

Totally. I defaulted to Error to make sure it would appear regardless of the configured log level, but I think it didn't even appear.

teto avatar Apr 04 '24 08:04 teto


It would hardly be a custom format; Prometheus/OTEL is a well-standardized format for metrics exposition. It's also fairly easy for a human to read.

You are also correct that the zerolog logger used in main.go (and throughout the Go codebase) cannot be used inside the C++ codebases. They actually run as completely separate binaries and communicate with each other over gRPC. In order for LocalAI to log any data that comes from the gRPC backend, that data must be sent over the gRPC connection. I just tested, and this change does not compile, with the following errors:

/home/cryptk/Documents/sourcecode/LocalAI/backend/cpp/llama/llama.cpp/examples/grpc-server/grpc-server.cpp: In member function ‘void llama_client_slot::print_timings() const’:
/home/cryptk/Documents/sourcecode/LocalAI/backend/cpp/llama/llama.cpp/examples/grpc-server/grpc-server.cpp:342:13: error: request for member ‘Info’ in ‘log’, which is of non-class type ‘double(double) noexcept’
  342 |         log.Info().Msgf("LocalAI version: %s", internal.PrintableVersion())
      |             ^~~~
/home/cryptk/Documents/sourcecode/LocalAI/backend/cpp/llama/llama.cpp/examples/grpc-server/grpc-server.cpp:342:48: error: ‘internal’ was not declared in this scope
  342 |         log.Info().Msgf("LocalAI version: %s", internal.PrintableVersion())

Basically, those Go variables/functions are not available inside the C++ codebase.

The data you are attempting to log is already being sent over that gRPC connection to be logged by LocalAI on L344.
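To illustrate the flow: the C++ backend sends its timings over gRPC, and the logging happens on the Go side, where zerolog is available. The PredictReply type and its field below are hypothetical stand-ins for whatever the real proto carries.

```go
package main

import (
	"github.com/rs/zerolog/log"
)

// Hypothetical stand-in for the generated gRPC reply type; the real
// proto message and field names will differ.
type PredictReply struct {
	TokensPerSecond float64
}

// logBackendTimings runs on the Go side, where zerolog is available;
// grpc-server.cpp cannot call into Go code directly.
func logBackendTimings(reply *PredictReply) {
	log.Debug().
		Float64("tokens_per_second", reply.TokensPerSecond).
		Msg("timings received from gRPC backend")
}

func main() {
	logBackendTimings(&PredictReply{TokensPerSecond: 42.5})
}
```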

cryptk avatar Apr 04 '24 20:04 cryptk

Going to close this one per the review above.

cryptk avatar Apr 29 '24 16:04 cryptk