[Feature]: Improve LiteLLM Proxy Debug / Verbose Logs
The Feature
Goal
- I needed to see ONLY: the original request, why a request failed if it did (e.g. a TimeoutError), and the next model it was falling back to (see the sketch below this list)
"LiteLLM verbose logs are not human readable"
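For context, a minimal sketch of the kind of fallback setup this refers to, assuming the `fallbacks` syntax from the LiteLLM proxy docs (the model names and timeout here are placeholders, not from this issue):

```yaml
# config.yaml -- illustrative only; model names are placeholders
model_list:
  - model_name: gpt-4
    litellm_params:
      model: openai/gpt-4
      timeout: 10          # a slow upstream here triggers the TimeoutError
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: openai/gpt-3.5-turbo

litellm_settings:
  # on failure, retry the same request on the next model in the list
  fallbacks: [{"gpt-4": ["gpt-3.5-turbo"]}]
```

The ask is that the logs for this flow state three things in plain language: the original request, the TimeoutError on gpt-4, and the fallback to gpt-3.5-turbo.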
Motivation, pitch
proxy user feedback
Twitter / LinkedIn details
No response
I am using litellm in proxy mode, and when I run `litellm --logs` as per https://litellm.vercel.app/docs/proxy_server, I get the error message:

```
Usage: litellm [OPTIONS]
Try 'litellm --help' for help.

Error: No such option: --logs (Possible options: --alias, --host, --local)
```

Is `--logs` still supported? Also, there is no `api_logs.json` file in the current directory (while I do have errors showing up in the console).
This is deprecated @shuther; it's on the old proxy docs. Are you just trying to view LLM input / output logs?
You can use our Langfuse callback: https://litellm.vercel.app/docs/proxy/logging#logging-proxy-inputoutput---langfuse
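For anyone landing here, a sketch of that setup based on the linked page (the key values are placeholders):

```yaml
# config.yaml -- log proxy input/output to Langfuse
litellm_settings:
  success_callback: ["langfuse"]
```

```shell
export LANGFUSE_PUBLIC_KEY="pk-..."
export LANGFUSE_SECRET_KEY="sk-..."
litellm --config config.yaml
```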
I am not sure that Langfuse solves the log problem; I want to capture DEBUG/INFO messages, not just prompts. For example: for test purposes I set up a self-hosted Langfuse, and litellm hung because the connection to the Docker container did not work (one could simulate the same network issue against the Langfuse cloud offering). I think it is still important to copy some logs to a file. Also, for Langfuse, could this call happen in the background so it does not block the reply to the user?
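On copying logs to a file, a minimal sketch using only the standard library, assuming litellm routes its DEBUG/INFO messages through a Python logger named "LiteLLM" (that name is an assumption; attach the handler to the root logger if it differs):

```python
import logging

# Assumption: litellm's debug/info messages go through a logger
# named "LiteLLM"; fall back to logging.getLogger() (root) otherwise.
litellm_logger = logging.getLogger("LiteLLM")
litellm_logger.setLevel(logging.DEBUG)

# Tee everything to a local file so logs survive even when the
# Langfuse endpoint is unreachable.
file_handler = logging.FileHandler("litellm_debug.log")
file_handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
)
litellm_logger.addHandler(file_handler)
```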
Also, do you support the environment variable LANGFUSE_HOST? I am getting some errors, but I would like to confirm before I submit a bug report.
Yes, we do support LANGFUSE_HOST @shuther.
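For reference, a sketch of pointing the callback at a self-hosted instance, assuming the Langfuse SDK's usual environment variables (the URL and keys are placeholders):

```shell
# Self-hosted Langfuse (the Docker default port is 3000); keys are placeholders
export LANGFUSE_HOST="http://localhost:3000"
export LANGFUSE_PUBLIC_KEY="pk-..."
export LANGFUSE_SECRET_KEY="sk-..."
```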