MetaGPT
Feature: mute stream log for info level
Features
See issue #1109.
The LLM stream_response is always printed regardless of the debug flag.
I modified logs.py rather than config2.py. This is because the log level is set via define_log_level rather than config.yaml. To keep things simple, a global variable _print_level is introduced in logs.py.
Impact
define_log_level(print_level="INFO") will mute log_llm_stream.
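For reference, a minimal sketch of the idea (assuming loguru, which logs.py uses; the helper bodies here are illustrative, not necessarily the merged code):

```python
# logs.py (sketch): remember the console level in a module-level global
# so log_llm_stream can decide whether to print the LLM stream.
import sys

from loguru import logger as _logger

_print_level = "INFO"  # updated by define_log_level


def define_log_level(print_level="INFO", logfile_level="DEBUG"):
    """Adjust console/file log levels and remember the console level."""
    global _print_level
    _print_level = print_level

    _logger.remove()
    _logger.add(sys.stderr, level=print_level)
    _logger.add("logs/log.txt", level=logfile_level)
    return _logger


def log_llm_stream(msg):
    """Print LLM stream chunks only when the console level is DEBUG."""
    if _print_level == "DEBUG":
        print(msg, end="")


logger = define_log_level()
```

With this, define_log_level(print_level="DEBUG") re-enables the stream output, and the default "INFO" keeps it quiet.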
@azurewtl Thanks for your improvements. It works when I call define_log_level(print_level="DEBUG") at my Python script entry point when I want to print the LLM stream log. However, I'm considering whether to provide a simpler way to set it up, perhaps by passing the environment variables METAGPT_LOG_PRINT_LEVEL and METAGPT_LOG_FILE_LEVEL?
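One possible shape for that override (not part of this PR; the variable names come from the comment above, while the defaults and fallback order are assumptions):

```python
# Sketch of the suggested environment-variable override (not in this PR).
import os
import sys

from loguru import logger as _logger


def define_log_level(print_level=None, logfile_level=None):
    """Explicit arguments win; otherwise fall back to environment variables."""
    print_level = print_level or os.getenv("METAGPT_LOG_PRINT_LEVEL", "INFO")
    logfile_level = logfile_level or os.getenv("METAGPT_LOG_FILE_LEVEL", "DEBUG")

    _logger.remove()
    _logger.add(sys.stderr, level=print_level)
    _logger.add("logs/log.txt", level=logfile_level)
    return _logger
```

That way, METAGPT_LOG_PRINT_LEVEL=DEBUG python run.py would show the LLM stream without touching any code.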
I thought about using config. However, my proposal is that the debug flag should be passed via the command line. Oftentimes, when we want to see debug info during development, we run the program and pass a debug flag, such as:
python run.py --debug
YAML may be better suited for storing config that is NOT changed on an ad-hoc basis. Otherwise, if I want to show more or less info, I would need to change the config frequently.
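For illustration, a minimal entry-point sketch of that pattern (run.py and the flag wiring are hypothetical, not MetaGPT's actual CLI):

```python
# run.py (hypothetical sketch): map a --debug flag to define_log_level.
import argparse

from metagpt.logs import define_log_level


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--debug", action="store_true",
                        help="Show DEBUG logs, including the LLM stream")
    args = parser.parse_args()

    define_log_level(print_level="DEBUG" if args.debug else "INFO")
    # ... start the MetaGPT workflow here ...


if __name__ == "__main__":
    main()
```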
@shenchucheng I think this is a new requirement and can be discussed separately. This PR can be merged first. @azurewtl If there are new changes, please submit another PR.
Anyway, I’ll merge first